Data Analyst for Big Data


Dearborn, Michigan


Title: Data Analyst for Big Data
Location: Dearborn, MI
Duration: Long Term
Experience Range: 5 to 8 years
Job Summary:
The Java/Hadoop Developer position will provide expertise in a wide range of technical areas, including but not limited to: the Cloudera Hadoop ecosystem, Java, collaboration toolset integration using SSO, configuration management, hardware and software configuration and tuning, software design and development, and the application of new technologies and languages aligned with other Client internal projects.
Essential Job Functions:
Design and development of data ingestion pipelines.
Perform data migration and conversion activities.
Develop and integrate software applications using suitable development methodologies and standards, applying standard architectural patterns, taking into account critical performance characteristics and security measures.
Collaborate with Business Analysts, Architects and Senior Developers to establish the physical application framework (e.g. libraries, modules, execution environments).
Perform end-to-end automation of the ETL process for the various datasets ingested into the big data platform (a minimal ingestion sketch follows this list).
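For illustration only: a minimal sketch of the kind of ingestion/ETL step described above, assuming Spark with Hive support on a CDH cluster. The object name, input path, and target table (EventIngestion, hdfs:///data/raw/events, analytics.events) are hypothetical examples, not details from this posting.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EventIngestion {
  def main(args: Array[String]): Unit = {
    // Spark session with Hive support so Hive/Impala tables are visible.
    val spark = SparkSession.builder()
      .appName("EventIngestion")
      .enableHiveSupport()
      .getOrCreate()

    // Read one raw JSON drop from HDFS (path is a placeholder).
    val rawEvents = spark.read.json("hdfs:///data/raw/events/current")

    // Minimal conversion step: drop malformed rows, stamp ingest time.
    val cleaned = rawEvents
      .filter(col("event_id").isNotNull)
      .withColumn("ingest_ts", current_timestamp())

    // Append into an existing Hive table (schema must already match).
    cleaned.write.mode("append").insertInto("analytics.events")

    spark.stop()
  }
}

In practice a scheduler such as Oozie (named in the required skills) would invoke a job like this per dataset to achieve the end-to-end automation described.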
Required:
Java, J2EE, Web Applications, Tomcat (or an equivalent app server), RESTful services, JSON
Spring, Spring Boot, Struts, Design Patterns
Hadoop (preferably Cloudera CDH), HDFS, Hive, Impala, Spark, Oozie, HBase, Scala, SQL, Linux
Good to Have:
Google Analytics, Adobe Analytics
Python, Perl, Flume, Solr
Strong Database Design Skills
ETL tools
NoSQL databases (MongoDB, Couchbase, Cassandra)
JavaScript UI frameworks (Angular, Node.js, Bootstrap)
Other Responsibilities:
Document and maintain project artifacts.
Suggest best practices and implementation strategies using Hadoop, Java, and ETL tools.
Maintain comprehensive knowledge of industry standards, methodologies, processes, and best practices.
Other duties as assigned.
Minimum Qualifications and Job Requirements:
Must have a Bachelor's degree in Computer Science or a related IT discipline
Must have at least 5 years of IT development experience.
Must have strong, hands-on J2EE development experience
Must have in-depth knowledge of Scala and Spark programming
Must have 3+ years of relevant professional experience working with Hadoop (HBase, Hive, MapReduce, Sqoop, Flume), Java, JavaScript, SQL, Perl, Python, or an equivalent scripting language
Must have experience with ETL tools
Must have experience integrating web services
Knowledge of standard software development methodologies such as Agile and Waterfall
Strong communication skills.
Must be willing to flex work hours to support application launches and manage production outages if necessary.
Specific Knowledge, Skills and Abilities:
Ability to multitask with numerous projects and responsibilities
Experience working with JIRA and wikis
Must have experience working in a fast-paced, dynamic environment.
Must have strong analytical and problem-solving skills.
Must have excellent verbal and written communication skills.
Must be able and willing to participate as an individual contributor as needed.
Must have the ability to work the time necessary to complete projects and/or meet deadlines.
Job Type: Contract
Required education:
Bachelor's
Required experience:
Big Data: 2 years
