Job Details: Big Data Architect

Job Title : Big Data Architect
Job Ref No : 23621-0188
Job Posted Date : 5/5/2016
Job State : California
Job City : El Segundo, CA
Who Can Apply : US Citizens, GC, EAD, TN, H1B
Telecommute Needed : No
Travel Required : No
Skills Required : Architect, Hadoop Architect

Job Description :

Big Data Architect
Location: El Segundo, CA
Duration: 6+ Months
Interview: Phone then Skype

Note: I need a visa copy with the submittal.

Job Description:
Experience: 11+ years (less than 17 years)

Hadoop, Hive, HBase

A data architect who will work in an agile environment to architect and design solutions in the BI/Big Data and analytics space for one of our key clients.
The candidate will work closely in a team of big data and data warehouse engineers, business analysts, business intelligence engineers, system analysts, quality assurance engineers, database administrators, and project managers in designing and developing an Enterprise Data Hub (EDH) for one of the largest broker-dealer networks in the US.
The candidate will design and implement strategies, architectures, and data ingestion, storage, consumption, and delivery processes for complex, large-volume, multi-variety, batch and real-time data sets used for modeling, data mining, dashboarding, and reporting purposes.
The candidate must be able to meet stated requirements, design best-fit solutions, develop detailed specifications and test plans for implementation. Since EDH works with multiple business units to create cross functional solutions, the candidate must be able to interface with various business SMEs to understand the requirements and prepare documentation to support development.
The candidate must be proficient in the Hadoop ecosystem, having used Hadoop 1.0 and Hadoop 2.0 tools to ingest, normalize, and analyze data to meet business objectives. Analysis experience must span descriptive, predictive, and prescriptive analytics and the corresponding tools.
The candidate must be comfortable developing data-centric and reporting applications using tools like QlikView, Tableau, Pentaho, Datameer, or similar.
The candidate must be able to architect, design, review, and build robust ETL or ELT tools and scripts, and optimize data transformation from various external data sources.
The candidate must be open to learning Splice Machine, the first RDBMS powered by Hadoop and Spark.
The candidate must have excellent communication skills, work well in a team environment, enjoy solving complex problems, and be able to work in a fast-paced environment.

Essential Skills:
Expert experience with Linux RHEL OS environments
Expert experience in RDBMS (ANSI SQL queries, views, stored procedures, import/export scripts)
Expert experience with logical, 3NF, and dimensional data models
Working experience with Apache Sqoop
Working experience in ETL or ELT
Working experience in Java and Shell script
Working experience in Data Mining/Data Warehousing/Business Intelligence concepts
Working experience with Cloudera cluster
Working experience with Business Intelligence tools and platforms like WhereScape RED, QlikView/QlikSense, Tableau, and Datameer, a plus
Working experience with Oracle RAC and MS SQL Server
Working experience with data quality tools such as Informatica IDQ
Working experience in an Agile/Scrum model
Working knowledge of data, master data and metadata related standards, processes and technology

Soft Skills:
Excellent oral and written communication skills
Excellent customer service skills
Excellent analytical and problem solving skills
Ability to work independently with minimal assistance
Ability to share knowledge and coach/mentor a team

Ability to review technical documentation to verify compliance with established architectural standards and guidelines.

