DomainTools is seeking a Data Systems Engineer to join our team. Our engineers predominantly use Python to gather, process and analyze large data sets and to build data services used by our customers for cybersecurity research and threat intelligence. This is a full-time, permanent position in our Seattle (Belltown) office.
At DomainTools, you will work as part of a collaborative team of smart, engaged engineers. Expect to spend most of your time solving complex data processing problems and building highly scalable backend services. We foster productivity: you will spend very little time in meetings, and you won't be distracted by processes that get in the way of high-quality results. You will work with big data and machine learning that matter.
Responsibilities:
- Design, code and maintain massive data gathering, processing and delivery systems
- Discover, analyze and validate new data sets to add value for our customers
- Solve necessarily complex distributed systems problems as simply as possible
- Research and employ cutting-edge techniques to work with data well beyond internet scale
- Architect and build data delivery solutions in a microservice environment
- Provide code reviews and design feedback for teammates
- Work with both real-time and offline data processing pipelines to get the best available data to our customers as soon as possible
- As a senior member of the team, you will also mentor junior engineers
Key Applicant Qualifications:
- Python3 experience: 2+ years
- Data Engineering experience: 3+ years
- Software engineering experience: 5+ years
- Experience with software development in a Linux/Unix environment
- Experience with large-scale MySQL and NoSQL (Cassandra, Elasticsearch) data sets (2+ years), or a strong desire to learn
- Experience processing and serving big data sets with technologies like Hadoop, Hive and NoSQL databases: 2+ years
- Experience with web-scale data collection
- Proficiency with the technologies behind highly scalable web services (e.g. caching, load balancing, sharding)
- Positive attitude with strong attention to detail and a desire to produce high-quality results
- History of working effectively in a small team environment; a strong team player with the ability to troubleshoot complex problems
- Experience in designing, building and maintaining large-scale data infrastructures
- Experience with code/build/deployment tooling such as GitLab
- Bachelor's degree or higher in Computer Science or related field
- Excellent written and verbal communication skills
- Proven ability to interact, evangelize, present and influence at all levels of a company, from the C-suite to engineers
- Excitement about the security space
- Experience with data mining or machine learning techniques
- Experience with a wide variety of text codecs and encodings
- Experience delivering microservice architectures
- Experience with CI/CD
- Experience with Kubernetes
- Java/Scala, C, or R development experience
- Experience with data pipeline technologies such as Kafka or Apache Airflow
- Experience with serverless technology such as AWS Lambda or Google Cloud Functions