Starting December 10, 2018, Data Application Lab is launching a brand-new column, long in the making: one internal-referral opportunity every day, with exclusive resources and more direct job matching.

If you are actively seeking a job in a related field, we hope you will check our daily application opportunities. If you have any questions, feel free to discuss them with us via the article comments or a direct message.

Internal Resources

Released periodically to our readers only

Senior Staff Engineer

Santa Clara, CA

Hours per Week

Background Check Requirement

Competitive Salary

Responsibilities:

- Build data pipelines and ETL from heterogeneous sources, including IoT, using Kafka, Flume, Sqoop, Spark Streaming, custom code, etc. Your solutions will support both real-time and near-real-time applications

- Partner with product owners, data analysts, and data scientists to better understand their requirements, find bottlenecks, and devise the best solutions

- Expand and grow data platform capabilities to solve new data problems and challenges

- You will work closely with local and overseas development and QA teams to design ingestion pipelines

- You will also mentor other team members

Desired Skills & Experience:

- 6+ years of software development experience, with a minimum of 2 years of hands-on experience in the Hadoop ecosystem and other Big Data technologies (Kafka, Cassandra, MongoDB, and Elasticsearch)

- Expert level software development experience using Java and/or Python

- Hands-on experience building big data ingestion pipelines using Hadoop, HBase, Kafka, Elasticsearch, etc.

- Knowledge and hands-on experience with relational databases (Greenplum, Oracle, MySQL/Postgres, etc.) and database schema design

- Experience with Unix/Linux and shell scripting

- Knowledge of professional software engineering practices and best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, and testing

- Ability to take a project from scoping requirements through launch

- BS/MS Degree in Computer Science or related field

- Excellent interpersonal and teamwork skills; experience working with overseas teams is a plus

If you are interested in this position,

please send your latest Resume and a brief self-introduction

Subject Line: Job#ID+Name
