Job Title: Big Data Technical Analyst
Location: Lagos
Job Description
- A Big Data Technical Analyst proficient with Python, Apache Hadoop, Apache Cassandra and other Big Data and Enterprise Analytics tools is needed to join a dynamic team.
Responsibilities
- Designs, develops and tests new and/or existing data solutions running on the client’s platforms.
- Analyzes technical and business requirements to develop a systems solution that aligns with enterprise best practices including user experience and accessibility.
- Utilizes agile software development practices, secure coding practices, code reviews, and software architecture.
- Uses knowledge of distributed computing techniques to design, develop and test scalable ETL processes that operate on large-volume datasets.
- Familiar with handling datasets containing mixes of structured and unstructured data.
- Transforms unstructured data into suitable forms for analysis and modeling.
- Performs extract, transform and load (ETL) integrations with a variety of data sources.
- Writes ad-hoc scripts and queries, schedules batch jobs, and develops and monitors real-time streaming applications.
- Works with team to identify and build efficient and scalable processes.
- Works as part of a development team to build and maintain the ETL processes.
- Develops, tests, debugs, documents and helps operationalize various ETL applications, leveraging industry best practices.
- Collaborates closely with cross-functional IT teams to identify, troubleshoot and fix data issues, and to resolve data gaps that impact the fulfillment of the business's functional requirements.
Qualifications/Experience
- At least 3 years of hands-on ETL experience.
- Strong foundation in big data concepts – MapReduce, RDDs, batch and stream processing, data formats, ETL process flow.
- Experience in Python programming.
- Proficient in Spark and Spark streaming architecture.
- Experience using Big Data based applications/tooling/languages such as Hadoop, Spark, Kafka, Hive, HBase.
- Hands-on telco experience.
Method of Application
Interested and qualified candidates should send their CV (in Microsoft Word format), along with their current and expected salary, to: [email protected], [email protected], [email protected] and [email protected], with the Job Title as the subject of the mail.
Job Features
Deadline: 28th August, 2020.