Short Description
HuQuo is seeking a Cloud Architect with expertise in troubleshooting highly complex distributed environments, including following stack traces back to code and identifying root causes.
- An architect on our Cloud Platforms and Infrastructure team has 5-8 years of experience in DevOps, SecOps, or SysOps in a large enterprise environment and would take a lead role in the design and analysis of Cloud-based Big Data and Analytics solutions across a variety of industries and computing environments.
- Architects plan system design and deployment and are responsible for meeting software compliance requirements.
- A key difference between consulting and an internal IT role is the ability to articulate the trade-offs between hardware and software options, operational requirements, and characteristics of the overall system to help shape the solution.
- Your expertise in the use of cloud architecture and solutions to support Big Data and Analytics solutions would enable software deployment in a DevOps environment.
- You would have a strong grasp of a wide range of technical principles, theories, and concepts in the field and apply general knowledge of related disciplines. You would mentor junior resources and develop them into leaders.
- You'd provide technical solutions related to supporting Big Data in the cloud to a wide range of difficult problems.
- Bachelor's degree or equivalent experience
- 5 years of experience in a Cloud Engineering role
- Extensive experience with multiple operating systems: UNIX, Linux, and Windows
- 3+ years of experience with multi-threaded, distributed Big Data Cloud architectures and frameworks, including Hadoop, MapReduce, Spark, Hive, and Elasticsearch, for conducting big data analytics
- Strong experience (5+ years) with Amazon Web Services, Azure, or Google Cloud Platform and their supporting technologies
- Expertise in troubleshooting highly complex distributed environments, including following stack traces back to code and identifying root causes
- Strong background in Extract, Transform, and Load (ETL) processes, including document parsing techniques and mapping large (multi-terabyte) data sets
- Willingness and appetite for continuous learning
- Desire to participate in the hiring and development of the team.
- Agile methodologies (Scrum, Kanban, SAFe)
- At least one certification required: AWS (Solutions Architect), Azure (MCSE Cloud), or Google (Certified Data Engineer, Certified Cloud Admin)
- Experience with VLDB, OLTP, OLAP, EDW, Big Data, Streaming, and Advanced Analytics efforts in the cloud
- BS degree in CS, Statistics, Physics, Math, Engineering, or a related field