Planning and executing a variety of methodologies as part of the concept stage in the overall project development of data solutions. Creating GUI prototypes. Researching, planning and developing project strategies. Designing the architecture and analyzing and developing mappings, transformations, mapplets, sessions and workflows to integrate resident data using ETL tools. Creating ETL data pipelines using PySpark for data ingestion and ETL processing, and using the AWS Data Pipeline service and AWS Glue for data ingestion. Coordinating the building of solutions using ETL, Informatica, Tableau, Oracle, Teradata, AWS and Python scripts. Migrating ETL code from Teradata to Snowflake in the AWS Cloud. Overseeing the implementation of test validations of the solutions. Ensuring the optimization of the developed solutions. Producing project documentation.
Requirements: Master’s Degree or foreign degree equivalent in Computer Science, Computer Information Systems, Computer Applications, Information Technology or Engineering and one year’s experience in the position or one year’s experience in the IT field (or a Bachelor’s Degree and five years’ experience).
Special requirements: Experience with ETL, Informatica, Tableau, Oracle, Teradata, AWS and Python scripts. Travel to various unanticipated client sites required. May reside anywhere in the United States.
Any applicant interested in this position may apply by regular mail (including Reference Number 14358) to:
Mr. Gordon Drijver
1001 East Palm Avenue
Tampa, FL 33605