Short Description

Deloitte is looking for a Data Delivery-Big Data-Senior Consultant who will focus on managing the information supply chain from acquisition to ingestion, storage, and the provisioning of data to points of impact by modernizing and enabling new capabilities.
- Function as an integrator between business needs and technology solutions, helping to create technology solutions that meet clients' business needs.
- Identify business requirements and carry them through requirements management, functional design, prototyping, process design (including scenario design and flow mapping), testing, training, defining support procedures, and supporting implementations.
The Team

Analytics & Cognitive

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment.
- 3+ years of relevant technology architecture consulting or industry experience, including experience in information delivery, analytics, and business intelligence based on data from a hybrid of Hadoop Distributed File System (HDFS), non-relational (NoSQL, e.g., MongoDB, Cassandra), and relational data warehouses
- At least 1 year of hands-on working experience with Big Data technologies: MapReduce, Pig, Hive, HBase, Sqoop, Spark, Flume, YARN, Kafka, Storm, etc.
- 1+ years of hands-on experience with data lake implementations, core modernization, and data ingestion.
- 1+ years of hands-on experience designing and implementing data ingestion techniques for real-time and batch processing of video, voice, weblog, sensor, machine, and social media data into Hadoop ecosystems and HDFS clusters.
- 2+ years of experience leading workstreams or small teams
- Willingness to travel to client sites weekly, up to 80-100% (Monday-Thursday/Friday)
- Bachelor's Degree or equivalent professional experience
- AWS, Hadoop, or Spark certification
- Experience with cloud platforms such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP)
- Experience with data integration products such as Informatica PowerCenter Big Data Edition (BDE), IBM BigInsights, Talend, etc.
- Experience designing and implementing reporting and visualization for unstructured and structured data sets
- Experience designing and implementing scalable, distributed systems leveraging cloud computing technologies such as AWS EC2, AWS Elastic MapReduce (EMR), and Microsoft Azure
- Experience designing and developing data cleansing routines using typical data quality functions, including standardization, transformation, rationalization, linking, and matching
- Knowledge of standards, processes, and technology related to data, master data, and metadata
- Experience working with multi-terabyte data sets
- Experience with data integration in both traditional and Hadoop environments
- Ability to work independently and manage small engagements or parts of large engagements.
- Strong oral and written communication skills, including presentation skills (MS Visio, MS PowerPoint).
- Strong problem-solving and troubleshooting skills with the ability to exercise mature judgment.
- Eagerness to mentor junior staff.
- An advanced degree in the area of specialization is preferred.