PowerToFly

Python Data Engineer - Engineer Intmd Analyst - C11 - CHENNAI

Citi

Onsite Chennai, India
Posted 4 hours ago

Job Details

The Engineer Intmd Analyst is an intermediate-level position responsible for a variety of engineering activities, including the design, acquisition, and development of software and infrastructure in coordination with the Technology team. The overall objective of this role is to ensure quality standards are being met within existing and planned frameworks.

Responsibilities:

  • Design, develop, and optimize scalable data pipelines and ETL/ELT processes using Apache Spark (preferably with Scala or Python) to ingest, transform, and load large datasets from diverse sources.
  • Write, optimize, and troubleshoot complex SQL queries, stored procedures, and functions for data extraction, transformation, and reporting within relational and analytical databases.
  • Develop and maintain data models, schema definitions, and database objects in various data storage solutions (e.g., data warehouses, data lakes, operational databases).
  • Ensure data quality, integrity, accuracy, and consistency across all data assets through robust validation and monitoring mechanisms.
  • Collaborate closely with data scientists, data analysts, business intelligence developers, and application teams to understand data requirements and deliver appropriate data solutions.
  • Monitor data pipeline performance, identify bottlenecks, and implement optimizations to improve efficiency and reduce processing times.
  • Manage data lifecycle, including data archival, retention, and compliance with data governance policies and security standards.
  • Participate in code reviews, contribute to documentation, and adhere to engineering best practices.
  • Troubleshoot and resolve data-related issues in production environments.
  • Contribute to the evaluation and selection of new data technologies and tools.
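The ETL/ELT and SQL work described above can be sketched in miniature. The following is an illustrative example only — the table names, columns, and data are invented, and an in-memory SQLite database stands in for the Spark/warehouse stack a role like this would actually use. It shows a common transform step: deduplicating raw records so only the latest version of each row is loaded downstream.

```python
import sqlite3

# In-memory database stands in for a warehouse; all names here are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, customer TEXT, amount REAL, loaded_at TEXT);
    INSERT INTO raw_orders VALUES
        (1, 'acme',   100.0, '2024-01-01'),
        (1, 'acme',   120.0, '2024-01-02'),  -- later correction of order 1
        (2, 'globex',  75.0, '2024-01-01');
""")

# Transform: keep only the most recent version of each order, a typical
# dedup step in an ELT pipeline, using a ROW_NUMBER() window function.
conn.executescript("""
    CREATE TABLE orders AS
    SELECT order_id, customer, amount
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY order_id ORDER BY loaded_at DESC
               ) AS rn
        FROM raw_orders
    )
    WHERE rn = 1;
""")

rows = conn.execute("SELECT order_id, amount FROM orders ORDER BY order_id").fetchall()
print(rows)  # [(1, 120.0), (2, 75.0)]
```

In Spark the same dedup pattern would typically use `row_number()` over a `Window.partitionBy(...).orderBy(...)` on a DataFrame; the SQL shape is essentially identical.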

Qualifications:

  • Experience: 5+ years of professional experience in data engineering, backend development with a strong data focus, or a related field.
  • Data Acumen: Strong understanding of data warehousing concepts, dimensional modeling, and data lake architectures.
  • Problem-Solving: Excellent analytical and problem-solving skills, with a keen attention to detail.
  • Communication: Good verbal and written communication skills, with the ability to articulate technical concepts to both technical and non-technical audiences.
  • Teamwork: Ability to work effectively in a collaborative team environment and contribute positively to team goals.
  • Agile: Experience working in an Agile/Scrum development methodology.

Education:

  • Bachelor’s degree/University degree or equivalent experience

Technical Skills:

  • Big Data Processing: Strong proficiency with Apache Spark (DataFrames API, Spark SQL) using Scala or Python.
  • Databases: Expert-level SQL skills. Extensive experience with relational databases (e.g., PostgreSQL, Oracle, SQL Server, MySQL) and experience with cloud-native data warehouses (e.g., Snowflake, Google BigQuery, AWS Redshift) or data lake technologies (e.g., Delta Lake).
  • Programming Languages: Strong proficiency in Python or Scala.
  • ETL/ELT Tools: Experience with ETL/ELT methodologies and tools, including data orchestration tools (e.g., Apache Airflow, Azure Data Factory, AWS Step Functions, GCP Cloud Composer).
  • Cloud Platforms: Exposure to major cloud platforms (AWS, Azure, GCP) and their data services (e.g., S3, ADLS, GCS, EC2, Azure VMs, Kubernetes).
  • Version Control: Proficiency with Git and standard version control workflows.
  • Data Modeling: Experience in designing and implementing efficient and scalable data models.
  • Performance Tuning: Ability to optimize Spark jobs, SQL queries, and database performance.
  • Linux/Unix: Familiarity with Linux/Unix environments for scripting and job execution.
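The dimensional-modeling and data-modeling skills listed above usually mean designing star schemas: narrow fact tables joined to descriptive dimension tables. A hypothetical sketch (schema and data are invented for illustration; SQLite stands in for a warehouse such as Snowflake or BigQuery):

```python
import sqlite3

# Invented star schema: one fact table keyed to one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name TEXT,
        region        TEXT
    );
    CREATE TABLE fact_sales (
        sale_id      INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        amount       REAL
    );
    INSERT INTO dim_customer VALUES (1, 'acme', 'EMEA'), (2, 'globex', 'APAC');
    INSERT INTO fact_sales VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")

# A typical analytical query: aggregate the fact table by a dimension attribute.
totals = conn.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer d USING (customer_key)
    GROUP BY d.region
    ORDER BY d.region
""").fetchall()
print(totals)  # [('APAC', 75.0), ('EMEA', 150.0)]
```

Keeping descriptive attributes (like `region`) in dimension tables rather than on every fact row is what keeps such queries simple and the fact table compact.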

------------------------------------------------------

Job Family Group:

Technology

------------------------------------------------------

Job Family:

Systems & Engineering

------------------------------------------------------

Time Type:

Full time

------------------------------------------------------

Most Relevant Skills

Please see the requirements listed above.

------------------------------------------------------

Other Relevant Skills

For complementary skills, please see above and/or contact the recruiter.

------------------------------------------------------

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law.

 

If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi.

View Citi’s EEO Policy Statement and the Know Your Rights poster.

Company Details
Citi

Work at Citi

About Citi

Working at Citi is far more than just a job. A career with us means joining a team of more than 200,000 dedicated people from around...
