Company: Axelon Services Corporation
Location: Broadway Junction
Closing Date: 28/11/2024
Hours: Full Time
Type: Permanent
Job Requirements / Description
Job Title: Scala Spark Lead (Hybrid - 3 days onsite)
Location: New York, NY
Pay: $50-55/hr
* Please include LinkedIn on your resume *
Mandatory Skills:
Big Data, Scala, Spark, Core Java
Big Data - 7+
Scala - 8+
Spark - 7+
Core Java - 5+
Description:
You will play a critical role in the design, development, deployment, and optimization of data processing applications.
Responsibilities:
Develop and maintain data processing applications using Spark and Scala.
Collaborate with cross-functional teams to understand data requirements and design efficient solutions.
Implement test-driven deployment practices to enhance the reliability of applications.
Deploy artifacts from lower to higher environments, ensuring smooth transitions.
Troubleshoot and debug Spark performance issues to ensure optimal data processing.
Work in an agile environment, contributing to sprint planning and development, and delivering high-quality solutions on time.
Provide essential support for production batches, addressing issues and providing fixes to meet critical business needs.
Qualifications:
10 years of experience.
Strong knowledge of the Scala programming language.
Excellent problem-solving and analytical skills.
Proficiency in Spark, including the development and optimization of Spark applications.
Ability to troubleshoot and debug performance issues in Spark.
Understanding of design patterns and data structures for efficient data processing.
Familiarity with database concepts and SQL.
Java and Snowflake (Good to have).
Experience with test-driven deployment practices (Good to have).
Familiarity with Python (Good to have).
Knowledge of Databricks (Good to have).
Understanding of DevOps practices (Good to have).