Data Engineer: ETL, Spark - FS Poly
This is a very cool company - become a Spark analytics expert, work with the best and brightest, make a lot of money, get equity that is expected to become very valuable in a short amount of time, and work on some of the best federal analytics projects. Some details are below.
Our client is a product and services firm, pre-IPO, that specializes in data science and analytics for the federal government and commercial market. They are seeking several Data Engineers / Federal Solutions Architects to join their Field Engineering team. The work is typically a 50/50 split between:
- development: building prototypes and proofs-of-concept and performing ETL on customer data, using technologies including Databricks (the Apache Spark platform), Hadoop, AWS, Azure, and just about anything else that could be used in cloud-based analytics
- customer work: working with federal customers to understand their requirements, present your prototypes and proofs-of-concept, and help them figure out the best way to build an analytic solution that meets their needs
The key skills you need coming in the door are development (in Java, Python, or Scala, whichever you prefer) and SQL. ETL experience is a huge plus. Our client will provide extensive training and hands-on mentoring in analytic technologies, particularly Spark (if you already have those skills, even better). Salary will be great, plus equity (which can lead to substantial additional compensation), a full benefits plan, etc.
Work locations will be at one or more customer sites close to you (if you live near Ft. Meade, you can support that customer; if you live in Northern VA, you can support customers close to your house). In many cases, part of the work can be done remotely using the company’s own clusters.
US citizenship is required. Security clearances up to and including FS Poly can be kept active - they have lots of customers throughout Civilian, Defense, and Intelligence agencies.