Free H4 EAD Sponsor for DevOps Lead by Tanu Infotech Inc
Hope you are doing well.
I am Gowttam – Technical Recruiter from Tanu Infotech Inc. We are a global staff augmentation firm providing a wide range of talent on-demand and total workforce solutions.
I have an opening for a DevOps Lead in Seattle, WA with our client, and we found your resume to be a great match for this role. If you are open to discussing this role in detail, feel free to reach me at 860-697-7343 with your best available time for a quick discussion.
Role: DevOps Lead
Duration: 6+ Months
Client: Infostrech/ Navigating Care
• Lead team initiatives to continuously refine our AWS deployment practices for improved reliability, repeatability, and security. You’ll create and contribute to plans and collaborate with other DevOps team members. These high-visibility initiatives help increase service levels, lower costs, and deliver features more quickly.
• Work closely with Engineering-Data team to automate deployment and configuration of infrastructure to support roll out of data products/projects on AWS Data Stack.
• Design effective monitoring/alerting (for conditions such as application errors or high memory usage) and log-aggregation approaches (to quickly access logs for troubleshooting or generate reports for trend analysis) to proactively notify business stakeholders of issues and communicate metrics, working closely with those stakeholders and using tools including AWS CloudWatch, Datadog, and ClearData.
• Write code and scripts to automate provisioning of AWS services and to configure services, using tools and languages including AWS CLI / API, Terraform, Ansible, Chef, Python, Bash, and Git.
• Configure build pipelines to support automated testing and deployments using tools including Jenkins, CircleCI, AWS CodeDeploy. You’ll configure these pipelines for specific products and help optimize them for performance and scalability.
• Help refine DevSecOps security practices (including regular security patching, minimum-permission accounts and policies, and encrypt-everything defaults) in compliance with Health IT, government, and other standards and regulations; implement and verify them using tools like SonarQube and Veracode to analyze and verify compliance.
• Document and diagram deployment-specific aspects of architectures and environments, working closely with Software Engineers, Software Engineers in Test, and others in DevOps.
• Troubleshoot issues in production and other environments, applying debugging and problem-solving techniques (e.g., log analysis, non-invasive tests), working closely with development and product teams.
• Suggest improvements to deployment patterns and practices based on learnings from past deployments and production issues; collaborate with the DevOps team to implement them.
• Promote a DevOps culture, including building relationships with other technical and business teams.
• Work closely with InterOps to deploy and configure the platform to on-board clinics.
• Work to ensure system and data security is maintained at a high standard, ensuring the confidentiality, integrity, and availability of the Navigating Cancer applications are not compromised.
• Set up the framework for a universal artifact-management tool such as Artifactory.
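The provisioning automation described above typically follows a declarative, desired-state model, which is the idea behind tools like Terraform and Ansible. As a minimal stdlib-only sketch (not the client's actual tooling; resource names are invented), a reconciler compares desired state against actual state and produces a plan:

```python
# Desired-state reconciliation sketch: the model behind IaC tools
# like Terraform. All resource names here are illustrative.

def reconcile(desired: dict, actual: dict) -> dict:
    """Compare desired vs. actual state and return a plan of actions."""
    to_create = {k: v for k, v in desired.items() if k not in actual}
    to_delete = {k: v for k, v in actual.items() if k not in desired}
    to_update = {k: desired[k] for k in desired
                 if k in actual and actual[k] != desired[k]}
    return {"create": to_create, "update": to_update, "delete": to_delete}

desired = {"web-sg": {"port": 443}, "app-server": {"type": "t3.medium"}}
actual = {"web-sg": {"port": 80}, "old-worker": {"type": "t2.micro"}}
plan = reconcile(desired, actual)
print(plan)
```

Running the plan (rather than hand-editing resources) is what makes deployments repeatable, one of the stated goals of this role.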
Requirements:
• Ability to automate away manual interactions, and a passion for helping enable developers to write code that works
• A strong understanding of Linux administration including Bash scripting
• Networking expertise including VPCs, SDNs (e.g., Amazon / Azure) / VLANs, routers and firewalls
• Familiarity with at least one IAC / CM tool such as Terraform, Ansible, Chef, or Puppet
• Familiarity with at least one code build/deploy tool such as Jenkins or CircleCI
• Familiarity with DB setup, configuration and monitoring
• A bachelor’s degree in science, technology, engineering, or a similar field is required.
• Able to enable capabilities through a blend of process and technology
• 6+ years of AWS administration experience/training, including provisioning EC2 instances, VPCs, Elastic Beanstalk, Lambda functions, RDS Aurora server/serverless databases, S3 storage, IAM security, ECS containers, and CloudWatch metrics & logs
• 5+ years of experience developing and / or deploying serverless functions using AWS Lambda, Azure Functions, or Google Cloud Functions
• Experience developing and / or deploying Docker Containers on ECS/EKS or Kubernetes
• Experience in automating provisioning of Infra to enable complete application ecosystem on demand
• 7+ years’ experience with SQL; adept in using RDS PostgreSQL or another DBMS
• Experience with monitoring / alerting tools such as New Relic, Grafana, Prometheus, Sysdig
• Experience with log aggregation tools such as Datadog, ELK, Splunk
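The monitoring/alerting design work called out above largely comes down to evaluating metric streams against thresholds. As a hedged, stdlib-only sketch (a real deployment would use CloudWatch or Datadog alarms; the metric name and threshold below are invented), here is the core evaluation logic:

```python
from dataclasses import dataclass

@dataclass
class AlarmRule:
    metric: str
    threshold: float
    periods: int  # consecutive datapoints that must breach to fire

def evaluate(rule: AlarmRule, datapoints: list[float]) -> bool:
    """Fire only if the last `periods` datapoints all exceed the threshold,
    mirroring the 'datapoints to alarm' idea used by CloudWatch alarms."""
    if len(datapoints) < rule.periods:
        return False
    return all(v > rule.threshold for v in datapoints[-rule.periods:])

mem = AlarmRule(metric="MemoryUtilization", threshold=90.0, periods=3)
print(evaluate(mem, [85, 92, 95, 97]))  # True: three consecutive breaches
```

Requiring several consecutive breaches rather than one spike is a common way to avoid paging stakeholders on transient noise.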
Thanks & Regards,
Tanu Infotech Inc
360 Bloomfield Avenue, Suite 301
Windsor, CT - 06095
Skills: cloud technologies, Amazon
Free H4 EAD Sponsor for ReactJS Developer by Tanu Infotech Inc
Hope you are doing well!
I am Gowttam – Technical Recruiter from Tanu Infotech Inc. We are a global staff augmentation firm providing a wide range of talent on-demand and total workforce solutions.
I have an opening for a React JS Developer in San Francisco, CA with our client, and I found your resume to be a great match for this role. If you are open to discussing this role in detail, feel free to reach me at 860-697-7343 with your best available time for a quick discussion.
Role: React JS Developer
Location: San Francisco, CA
Duration: 6+ months contract (extendable)
NOTE: The candidate should work as the lead for this position.
Positions: 1 (San Francisco, CA)
Seeking a skilled React JS Lead with at least 5 years of active and recent experience with React JS, developing enterprise web/mobile applications in an agile model. Must be comfortable working as a consultant and have strong written and verbal communication skills. The position is based in San Francisco, CA.
Roles & Responsibilities:
• Requirement analysis, coding, design, implementation, testing, problem analysis and resolution, and technical documentation
• Identify inconsistencies in architecture and determine simpler and optimal programming solutions
• Able to create designs for small components and participate in and contribute to research projects
• Understand technical requirements and how they relate, with familiarity in general performance practices such as caching, query optimization, and memory utilization and cleanup
• Knowledgeable about all aspects of the project and able to jump into support as needed; hands-on, produces consistently solid project work, and is highly valued by the project team
• Familiarity with agile methodology and proficiency in participating in stand-ups and handling the task workflow
• Leading the team
• Minimum 2+ years of strong hands-on experience developing React JS-based applications with Redux
• Experience developing Adaptive or Responsive websites using UI technologies like HTML5, CSS3 and jQuery
• Experience in unit-testing code with Jest, Enzyme, Jasmine, Mocha, or Chai is desired
• Experience in agile software development
• Awareness of new and emerging front-end technologies
• Good verbal, written, and presentation skills
• Able to lead the team
• B.S. in computer science, software engineering, computer engineering, electrical engineering, or related area of study.
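The Redux requirement above centers on one contract: a pure reducer function mapping (state, action) to a new state, never mutating in place. Sketched here in Python to keep the example self-contained (the actual role uses JavaScript/React, and the action names below are invented):

```python
# Redux's core contract, illustrated in Python: a reducer is a pure
# function (state, action) -> new state. Action types are invented.

def counter_reducer(state: dict, action: dict) -> dict:
    """Never mutates `state`; always returns a new dict."""
    if action["type"] == "INCREMENT":
        return {**state, "count": state["count"] + 1}
    if action["type"] == "RESET":
        return {**state, "count": 0}
    return state  # unknown actions leave state unchanged

state = {"count": 0}
for action in [{"type": "INCREMENT"}, {"type": "INCREMENT"}]:
    state = counter_reducer(state, action)
print(state)  # {'count': 2}
```

Because reducers are pure, state transitions are easy to unit-test with Jest (or any test runner), which is why the posting pairs Redux experience with unit-testing experience.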
Thanks & Regards,
Tanu InfoTech Inc
360 Bloomfield Avenue, Suite 301
Windsor, CT - 06095
firstname.lastname@example.org | www.tanuinfotech.com
Skills: developer, front-end web developer
Free H4 EAD Sponsor for Big Data/Hadoop Developer by Spout
Job Title: Senior Engineer – Big Data & Analytics
Job Location: Plano, Texas
Job Duration: 12+ Months
The Senior Engineer will report to the Senior Manager – Business Intelligence & Analytics.
• Developing automated methods for ingesting large data sets into an enterprise-scale analytical system using Sqoop, Spark, and Kafka
• Identifying technical implementation options and issues
• Partners and communicates cross-functionally across the enterprise
• Ability to explain technical issues to senior leaders in non-technical, understandable terms
• Foster the continuous evolution of best practices within the development team to ensure data standardization and consistency
Core Competencies & Accomplishments:
• 8 + years of professional experience
• 3+ years of experience with Big Data technology and Analytics
• Good experience in Java
• 3+ years of experience in ETL and ELT data modeling
• Experience working with traditional data warehouses and correlating them into a Hive warehouse on big data technologies
• Experience setting data modeling standards in Hive
• Experience with streaming stacks like Spark
• Understanding of Big Data tools (e.g., NoSQL databases, Hadoop, HBase) and API development and consumption
• Proficiency in using query languages such as SQL, Hive
• Understanding of data preparation and manipulation using Datameer tool
• Knowledge of SOA, IaaS, and Cloud Computing technologies, particularly in the AWS environment
• Knowledge of setting standards around data dictionary and tagging data assets within the Data Lake for business consumption.
• Experience in one or more languages (e.g., Python, Java, or Groovy)
• Experience with data visualization tools like Tableau
• Experience in agile software development paradigm (e.g., Scrum, Kanban)
• Strong written and verbal communication.
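The SQL/Hive and ETL work described above follows a common shape: extract raw records, transform them with an aggregate query, and load the result into a summary table for business consumption. As a minimal sketch using Python's built-in sqlite3 as a stand-in for Hive or RDS (table and column names are invented):

```python
import sqlite3

# Tiny extract-transform-load sketch; sqlite3 stands in for a
# Hive/RDS warehouse, and the schema is purely illustrative.
conn = sqlite3.connect(":memory:")

# Extract: land raw records in a staging table.
conn.execute("CREATE TABLE raw_events (user_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [("u1", 10.0), ("u1", 5.5), ("u2", 7.0)],
)

# Transform + load: aggregate into a summary table, as a warehouse
# job (e.g., a Hive query) would.
conn.execute(
    """CREATE TABLE user_totals AS
       SELECT user_id, SUM(amount) AS total
       FROM raw_events GROUP BY user_id"""
)
rows = conn.execute(
    "SELECT user_id, total FROM user_totals ORDER BY user_id"
).fetchall()
print(rows)  # [('u1', 15.5), ('u2', 7.0)]
```

In an ELT variant, the raw load happens first and the same aggregation runs inside the warehouse engine itself; the SQL is essentially unchanged.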