TITLE: SENIOR DATA ARCHITECT – BIG DATA
JOB ID: 10898
LOCATION: Chicago, IL - MUST live local to Chicago
STATUS: US Citizens or Green Card Holders Only, Hybrid On-Site 2-3 Days a Week
About the Client:
Join a client that is connecting the world and transforming the satellite landscape. They are defining new products that will bring the internet to every device, every flight, and everywhere you are. Their worldwide inflight Wi-Fi services have made internet access and video entertainment a regular part of flying.
About the Senior Data Architect – Big Data Position:
You will play a vital role in leading and managing the full lifecycle of big data solutions, including the development of the client's data architecture strategy, roadmap, and implementation plans.
About the Senior Data Architect – Big Data Responsibilities:
- Function as the primary point of contact for data architecture consultation for technical teams and recommend best practices in approach and design to development teams
- Demonstrate an understanding of customer requirements and existing environments in order to translate them into AWS configurations that meet performance, scalability, and availability needs
- Architect solutions that enable best-in-class analytic platforms and real-time monitoring, and broaden business intelligence and reporting capabilities
- Work collaboratively with the business and a diverse team of developers in an agile environment, providing critical architectural input to support new data features and services
- Ensure that expected infrastructure quality levels are achieved
- Utilize industry information management trends to innovate and contribute new project/product ideas within the big data lifecycle
About the Senior Data Architect – Big Data Requirements:
- 7+ years of relevant architecture and development experience with Big Data solutions (Hadoop)
- 5+ years utilizing cloud-based services (AWS)
- Technical proficiency across RDBMS/DBMS platforms, AWS and cloud computing, NoSQL, Apache Accumulo, Hadoop technologies (Hive, Pig), data mining and modeling tools, programming languages (Python, Java), operating systems, and backup systems
- Expertise in implementing AWS services in a variety of environments
- Expertise building data lakes and data warehouses in AWS using a variety of cloud services: EMR, EC2, S3, Kinesis, Lambda, DynamoDB, Glue, Athena, Redshift, SageMaker, and Glacier
- Expertise in analytics data architecture and data modeling
- Expertise in data analysis and in writing and developing complex SQL queries
- Bachelor’s degree in Computer Science, Computer Engineering, Information Systems, or a related technology field