Company: Aloden LLC
Location: Charlotte
Closing Date: 19/11/2024
Hours: Full Time
Type: Permanent
Job Requirements / Description
Java Developer with Hadoop (W2 Only: US Citizen, Green Card Holder, TN Visa, GC EAD, H4 EAD)
Location: Charlotte, North Carolina (Hybrid)
Candidate Preference: Local to Charlotte, NC or nearby areas.
Must-Have Skills & Experience:
Apache Spark: 3+ years of design and development experience using Scala or Java (Java preferred).
Big Data/Hadoop: 5+ years of experience working with Big Data technologies and Hadoop ecosystem tools like Spark and Hive.
Data Warehousing: 3+ years of experience in end-to-end design and delivery of data warehouse applications.
Agile: 2+ years of experience working in an Agile Scrum environment.
Nice-to-Have Skills:
Finance/Accounting: Prior experience working on Finance or Accounting applications.
Responsibilities:
Design, develop, and implement data processing solutions using Apache Spark and Java.
Build and maintain data pipelines within the Hadoop ecosystem.
Contribute to the development and maintenance of data warehouse applications.
Work in an Agile Scrum environment, collaborating with cross-functional teams.
Write clean, efficient, and well-documented code.
Troubleshoot and resolve technical issues related to data processing and storage.