AWS Data Engineer Reston

DS Technologies Inc

AWS Data Engineer

Full Time • Reston
About Us: We are a company that provides innovative, transformative IT services and solutions. We are passionate about helping our clients achieve their goals and exceed their expectations, and we strive to provide the best possible experience for our clients and employees. Committed to continuous improvement and innovation, we are always looking for ways to improve our services and solutions, and we believe in working collaboratively with our clients and employees to achieve success.
 
DS Technologies Inc is looking for an AWS Data Engineer for one of our premier clients.

Job Title: AWS Data Engineer  
Location: Reston, VA (Hybrid)
Position Type: Contract
W2 Only
 
Job Requirements:
  • Work with product owners and other development team members to determine new features and user stories needed in large/complex development projects. 
  • Create or update documentation in support of development efforts. Documents may include detailed specifications, implementation guides, architecture diagrams, or design documents. 
  • Participate in code reviews with peers and managers to ensure that each increment adheres to the original vision as described in the user story and to all standard resource libraries and architecture patterns, as appropriate. 
  • Respond to trouble/support calls for applications in production, making quick repairs to keep applications running. 
  • Mentor or provide technical guidance to less experienced staff; may use high-end development tools to assist or facilitate the development process. 
  • Lead projects/initiatives end to end with little supervision. 
  • Leverage the Fannie Mae DevOps tool stack to build, inspect, deploy, test, and promote new or updated features. May serve as technical lead, architect, project lead, or principal developer in the course of large or complex projects. 
  • Expert proficiency in unit testing as well as coding in 1-2 languages (e.g., Python, Java). 

Minimum Required Experience:
  • 5+ years of relevant experience 

Specialized Knowledge and Skills:
  • 3+ years of experience with Python, SQL, Spark, and Amazon EMR 
  • 3+ years of experience building and deploying applications in AWS (S3, Lambda, Elastic Beanstalk, Hive, Glue, Redshift, RDS, CloudWatch, SNS, SQS, Kinesis, etc.) 
  • Experience processing large amounts of structured and unstructured data 
  • Experience with CI/CD and knowledge of Git, Bitbucket, and Jenkins 

Tools:
• Skilled in AWS compute services such as EC2, Lambda, Elastic Beanstalk, ECS, and EMR 
• Strong SQL skills 
• Skilled in AWS database products such as RDS, Redshift, or Aurora 
• Skilled in the AWS management and governance suite of products, such as CloudTrail and CloudWatch 
• Skilled in Amazon Web Services (AWS) offerings, development, and networking platforms 
• Experience with or knowledge of R, a statistics-oriented programming language, for data visualization 

We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.




