Mid-level Hadoop Developer - Scala/Spark (Remote)
Information Technology company
W-2 only | US Resident | Contract
Jacksonville, FL 32256
Contract
This posting has been closed.
Seeking a Hadoop Developer with Scala, Spark, and Kafka skills for a 100% remote position working with healthcare big data.
Responsibilities
- Write code for moderately complex system designs. Write programs that span platforms. Code and/or create Application Programming Interfaces (APIs).
- Write code for enhancing existing programs or developing new programs.
- Review code developed by other IT Developers.
- Provide input to and drive programming standards.
- Write detailed technical specifications for subsystems. Identify integration points.
- Report missing elements in system and functional requirements and explain their impact on the subsystem to team members.
- Consult with other IT Developers, Business Analysts, Systems Analysts, Project Managers and vendors.
- Scope the time, resources, and other inputs required to complete programming projects. Seek review of estimates from other IT Developers, Business Analysts, Systems Analysts, or Project Managers.
- Perform unit testing and debugging. Set test conditions based on code specifications. May need assistance from other IT Developers and team members to debug more complex errors.
- Support transition of the application throughout the Product Development life cycle. Document what has to be migrated; subsystems may require additional coordination points.
- Research vendor products and alternatives. Conduct vendor product gap analysis and comparison.
- Be accountable for including IT Controls and following standard corporate practices to protect the confidentiality, integrity, and availability of the application and of the data it processes or outputs.
- The essential functions listed represent the major duties of this role; additional duties may be assigned.
Tech stack
- Hadoop
- Spark
- Hive (nice to have)
- HBase (nice to have)
- Mongo (nice to have)
- Kafka (nice to have)
- Experience working with large volumes of data (nice to have)
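For candidates gauging fit, here is an illustrative (hypothetical, not taken from the posting) sketch of the kind of transformation logic this role involves. It aggregates paid healthcare claim amounts per member, shown on plain Scala collections; Spark's Dataset API offers the same combinators (`filter`, `groupBy`, aggregation), so logic like this ports directly to a Spark job reading from Kafka or HDFS. The `Claim` type and field names are invented for the example.

```scala
// Hypothetical healthcare-claims record; fields are invented for illustration.
case class Claim(memberId: String, amount: BigDecimal, status: String)

// Total paid amount per member: keep settled claims, group by member,
// then sum each group's amounts. In Spark this would be the same
// filter/groupBy/sum pipeline expressed on a Dataset[Claim].
def totalPaidPerMember(claims: Seq[Claim]): Map[String, BigDecimal] =
  claims
    .filter(_.status == "PAID")
    .groupBy(_.memberId)
    .map { case (id, cs) => id -> cs.map(_.amount).sum }

// Small in-memory sample standing in for a Kafka topic or HDFS dataset.
val sample = Seq(
  Claim("M1", BigDecimal("120.50"), "PAID"),
  Claim("M1", BigDecimal("30.00"), "DENIED"),
  Claim("M2", BigDecimal("75.25"), "PAID")
)

val totals = totalPaidPerMember(sample)
```

At cluster scale, the only change is the source and the collection type: the same combinators run over a `Dataset[Claim]` built from `spark.read` or a Kafka stream.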
Education
- Bachelor's degree in an IT field, or equivalent related work experience
Requirements
- 3-5 years of related work experience, or an equivalent combination of transferable experience and education
- Professional IT development/programming/coding experience within a Hadoop environment