Data Engineer – AWS Cloud

19.01.2021 | Valencia (Spain) | Full-time

If this offer fits your profile, please send your resume.

Resumes that are not in English will not be considered.

ICC is committed to achieving diversity and inclusion within its workforce, providing an environment that reflects the values enshrined in the Charter of the United Nations, and encourages all qualified applicants to apply, irrespective of gender, nationality, disability, sexual orientation, culture, or religious and ethnic background. ICC is dedicated to the SDGs, making SDG-5 (Gender Equality) and SDG-10 (Reduced Inequalities) organizational goals.

Location: Valencia (Spain)

The scope of work includes:
• Participate in the project specification and software design phases
• Design and implement AWS architecture using AWS services and tools
• Design cloud-native application architectures or optimize applications for AWS
• Build data flow pipelines using S3, SQS, Lambda, Comprehend, Transcribe, Fargate, Aurora, API Gateway, CloudFormation templates (CFT), and SAM
• Data visualization skills in Tableau or AWS QuickSight are a plus, but not required.
• Collaborate within a project team to solve complex problems
• Ensure the delivered solution meets the technical specifications and design requirements
• Be responsible for meeting development deadlines
• Perform other duties as required
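As an illustration of the kind of data flow pipeline mentioned above (S3, SQS, Lambda), here is a minimal sketch of a Lambda handler that unpacks S3 object references from an SQS-delivered S3 event notification batch. All names and the event shapes are illustrative assumptions, not part of this offer:

```python
import json

def handler(event, context):
    """Hypothetical Lambda handler: extract (bucket, key) pairs from
    S3 event notifications delivered via an SQS trigger."""
    objects = []
    for record in event.get("Records", []):
        # Each SQS record body carries a serialized S3 event notification
        body = json.loads(record["body"])
        for s3_record in body.get("Records", []):
            bucket = s3_record["s3"]["bucket"]["name"]
            key = s3_record["s3"]["object"]["key"]
            objects.append((bucket, key))
    return objects

# Example: an SQS event wrapping a single S3 "ObjectCreated" notification
if __name__ == "__main__":
    sample = {
        "Records": [
            {
                "body": json.dumps({
                    "Records": [
                        {"s3": {"bucket": {"name": "raw-data"},
                                "object": {"key": "calls/2021/audio.wav"}}}
                    ]
                })
            }
        ]
    }
    print(handler(sample, None))  # [('raw-data', 'calls/2021/audio.wav')]
```

In a real pipeline this handler would hand each object off to downstream services such as Comprehend or Transcribe; the sketch stops at parsing the event envelope.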

The resource shall possess at least one of the following certifications:
• AWS Certified Developer – Associate
• AWS Certified DevOps Engineer – Professional (nice to have)
• AWS Certified Big Data – Specialty (nice to have)
The resource should also have:
• AWS Certified Cloud Practitioner (Preferred, not required)
• Minimum of 5 years of experience with Data warehousing methodologies and modelling
• Minimum of 2 years of experience working with Massively Parallel Processing (MPP) analytical datastores such as Teradata
• General understanding of the Snowflake architecture
• Minimum of 1 year of experience in handling semi-structured data (JSON, XML) using the
VARIANT attribute in Snowflake
• Minimum of 3 years of experience in creating master data datasets. Experience with MDM tools is a plus
• Minimum of 2 years of experience with migration methods to cloud data solutions
• Minimum of 3 years of experience working with batch and streaming data
• Minimum of 5 years of experience with SQL
• Minimum of 2 years of hands-on experience in Cloud technologies such as
o AWS – S3, Glacier, EC2, Lambda, SQS, Redshift, Athena, EMR, AWS Glue, AWS Lake
Formation, Kinesis, AWS Batch
o Azure – Blob Storage, Cool Blob Storage, Virtual Machines, Functions, SQL Data Warehouse, Data Factory, Databricks, Cosmos DB
• Minimum of 3 years of experience with ELT concepts
• Minimum of 3 years of experience with ETL tools such as Informatica
• Experience in Hadoop, Hive, HBASE, Spark is a plus
• Minimum of 5 years of RDBMS experience
• Experience working with DevOps tools such as GitLab, Jenkins, CodeBuild, CodePipeline, CodeDeploy, etc.
• Bachelor's or higher degree in Computer Science or a related discipline.
• Experience with containers (Docker)
• Strong Python programming skills
• Strong scripting skills: Bash Shell and PowerShell
• General understanding of and hands-on experience with Terraform
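To illustrate the semi-structured data handling called out above: Snowflake's VARIANT type lets you reach into nested JSON with dotted-path access. A plain-Python sketch of the equivalent flattening (illustrative only; this is not Snowflake's implementation, and all names in it are made up for the example):

```python
import json

def flatten(doc, prefix=""):
    """Flatten nested JSON into dotted key paths, loosely mimicking how
    dotted-path access over a Snowflake VARIANT column exposes nested
    fields. Illustrative sketch only."""
    out = {}
    if isinstance(doc, dict):
        for key, value in doc.items():
            out.update(flatten(value, prefix + key + "."))
    elif isinstance(doc, list):
        for i, value in enumerate(doc):
            out.update(flatten(value, prefix + str(i) + "."))
    else:
        out[prefix.rstrip(".")] = doc
    return out

# A semi-structured record as it might land in a VARIANT column
record = json.loads('{"caller": {"id": 42, "langs": ["en", "es"]}}')
print(flatten(record))
# {'caller.id': 42, 'caller.langs.0': 'en', 'caller.langs.1': 'es'}
```

In Snowflake itself the same access would be written in SQL (e.g. with colon-and-dot path notation and FLATTEN for arrays) rather than in application code.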