Big Data Cloud Engineer

Job ID: 2505606
Location: Charlotte, NC
Category: Information Technology, Telecommunications
Salary: $125,000.00 per year
Zip Code: 28201
Employment Type: Full time
Posted: 09.15.2018

Job Description:

*We are unable to provide sponsorship for this permanent full-time role*

A prestigious Fortune 500 company is currently seeking a Big Data Cloud Engineer. The candidate will be responsible and accountable for the engineering and operation of the company's big data and analytics platforms.

Responsibilities:

Oversee and develop the various Big Data capabilities in the cloud, aligning them with business strategies and requirements.
Architect solutions for massive scale, resiliency, and maintainability across various cloud providers that meet the technical, security, and business needs of applications and workloads.
Champion good engineering practices and help teams define and set up frameworks for Big Data as a Service.
Contribute to technology strategy and engineering roadmaps for Big Data in Cloud platforms and execute strategic engineering proofs of concept.
Drive adoption of the platform within business units.
Develop monitoring strategies for infrastructure, platforms, and applications, aligned with enterprise strategy and overall industry trends.
Champion the appropriate use of open source and commercial technology based upon industry trends and innovative concepts.

Qualifications:

Degree in Computer Science, MIS, or related area, or equivalent work experience.
5+ years of relevant senior level experience in infrastructure, analytics or solution design/architecture.
Demonstrable knowledge of Amazon Web Services or similar cloud computing platform.
Experience using an infrastructure-as-code approach to provisioning.
Good understanding of Linux - preferably RHEL.
Technical leadership and solution design.
Hands-on style - willingness and competence to make the necessary changes to our infrastructure and processes.
Able to work effectively across organizational and geographical boundaries.
Ability to clearly communicate ideas and solutions.
Demonstrable ability to learn new technologies quickly.

Preferred Skills:

Big Data Technologies such as Hadoop, EMR, Spark, Impala, Kafka, etc.
Data Warehousing, SQL, Relational Databases.
NoSQL Databases.
Development skills - Java, Scala, Python, Perl, shell scripting.
Experience with data integration/migration and ETL processing.
Storage - NAS, SAN, JBOD, Object Storage.
Automation, configuration management (e.g. Ansible, Puppet), DevOps practices, CI/CD pipelines (e.g. Jenkins).
Basic networking skills - switching, routing, firewalls, load balancing.
Linux Containers/Docker.
Workflow scheduling and management.
Disaster recovery (DR) and business continuity planning.

Company Info
Request Technology - Craig Johnson