Responsibilities
- Create new and maintain existing ETL jobs using the Talend ETL toolset.
- Design and implement ETL processes for extracting and transforming data from diverse sources, such as Cloudera, PostgreSQL, and SQL Server databases.
- Design and develop the necessary database tables and constraints as per requirements.
- Collaborate with team members to understand source system structures, data retrieval methods, and tooling within the organization.
- Support the development of data transformation logic using ETL tools or scripting languages such as SQL and Python.
- Clean, validate, and transform data to conform to the target schema and quality standards.
- Work with the team to execute data quality improvement plans.
- Participate in troubleshooting activities to maintain data integrity and process efficiency.
Requirements
- Degree in computer science, mathematics, or engineering
- 5-6 years' experience with Talend, Python, and Spark
- Good knowledge of and working experience with databases and the Hadoop ecosystem (Hive, Impala, HDFS)
- Understanding of data-warehousing and data-modeling techniques
- Knowledge of industry-standard visualization and analytics tools
- Good interpersonal skills and a positive attitude
- Team player with good communication skills to interact with clients and team members
Shortlisted candidates will be offered a 1-year agency contract.