- Location: US
- Posted: Mar 26
- Should be hands-on with big data technologies (Databricks, Python, Spark, PySpark, and Redshift).
- Experience with scheduling tools – Airflow.
- Design the framework for the Data Ingestion process. Provide a design solution based on the assessment document and review it with the customer.
- Coordinate with the offshore team to review code. Coordinate with the customer to obtain environment access for the offshore team.
- Hands-on with the technologies and tools mentioned above. Support and assist the testing team.
- Coordinate and resolve issues or provide clarifications during UAT.
- Hands-on POC implementation using Snowflake and AWS components.
- Production Migration and Support.
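As an illustrative sketch of the ingestion stack named above (not part of the posting itself): a common pattern with Spark/S3/Redshift pipelines is composing a Redshift COPY command that loads staged files from S3. All identifiers below (table, bucket, IAM role) are hypothetical placeholders.

```python
def build_copy_statement(table: str, s3_path: str, iam_role: str) -> str:
    """Return a Redshift COPY statement loading Parquet files from S3.

    Hypothetical helper for illustration only; real pipelines would
    execute this via a database driver or an Airflow operator.
    """
    return (
        f"COPY {table} "
        f"FROM '{s3_path}' "
        f"IAM_ROLE '{iam_role}' "
        f"FORMAT AS PARQUET;"
    )

# Example usage with placeholder names:
stmt = build_copy_statement(
    table="analytics.events",
    s3_path="s3://example-bucket/events/",
    iam_role="arn:aws:iam::123456789012:role/redshift-load",
)
print(stmt)
```

In practice a statement like this would typically be triggered from an Airflow task after the Spark job lands data in S3.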
Other locations: Columbus, OH