
Data Engineer - AWS to GCP | Engineer in Engineering Job at Bigr.io in Boston MA | 6201005050

This listing was posted on The Resumator.

Data Engineer- AWS to GCP

Location:
Boston, MA
Description:

Senior Data Engineer | Remote (East Coast hours) | W2 only, no C2C

About BigRio:
BigRio is a remote-based technology consulting firm headquartered in Boston, MA. We deliver solutions including custom development, software implementation, data analytics, and machine learning/AI integrations. We are a one-stop shop that attracts clients from a variety of industries because of our proven ability to deliver cutting-edge, cost-conscious software solutions. With extensive domain knowledge, BigRio has teams of data architects, data engineers, software engineers, web developers, and consultants who deliver best-in-class solutions across a variety of verticals. Our diverse industry exposure equips us with invaluable tools, tricks, and techniques to tackle complex software and data challenges. Our forward-thinking Big Data team is working on a number of data architecture and software-solution projects. You will join this high-caliber team as a Data Engineer working with our clients.

About the Job:
The Data Connectivity Team designs, builds, and maintains the integrated platform that securely procures and links critical business data from disparate internal and external sources. This involves assimilating structured, unstructured, and semi-structured data. The Data Engineer will touch all aspects of the data operation, in particular data infrastructure, to ensure a robust, efficient, and consistent foundation for enterprise consumption and application development.
Responsibilities:
  • Work with microservices in AWS and GCP
  • Engineer the migration from AWS to GCP (ELT/ETL)
  • Work with GCP BigQuery
  • Build robust, scalable, and fault-tolerant ETL pipelines that allow flexibility and efficiency with minimal overhead and maintenance
  • Take responsibility for the data integrity, consistency, and quality of new releases
  • Provide mentorship and technical leadership to the team's junior data engineers
  • Work with other architects and engineers to define, execute, and update core data systems while maintaining a high level of availability and transactional correctness
  • Help define future technical directions for data systems in collaboration with senior management, product management, and stakeholders

Minimum Job Qualifications:
  • 7+ years of full-time professional experience architecting, building, and optimizing ETL pipelines and data warehouses to onboard and streamline data cleansing, transformation, standardization, and aggregation processes, with a design mindset centered on flexibility, robustness, computing efficiency, and maintainability
  • Hands-on proficiency in SQL and in the data engineering libraries of at least one scripting language (e.g., Python), along with a deep understanding of the underlying algorithms and execution efficiency
  • Proven track record of owning data integrity and QA, with exceptional attention to detail
  • Expertise in data modeling and system design
  • Comfort working in a fast-paced environment with moving targets and changing priorities
  • Proven ability to speak both business and technology and to liaise effectively with both teams
  • Demonstrated expertise in information architecture, data engineering, and data warehousing
  • Familiarity with analytics and data science
  • Strong experience designing distributed systems for scale and high availability
  • Extensive experience designing microservice-based applications and data pipelines, including ETL concepts
  • Demonstrated data management experience with SQL database platforms: MySQL, Postgres, Redshift, Snowflake, Oracle, or SQL Server
  • Deep experience with AWS and GCP
  • Proven ability to collaborate, build relationships, and influence stakeholders at all levels in a matrix-management environment

Preferred Job Qualifications:
  • Direct experience with GCP BigQuery
  • Hands-on experience with web analytics, third-party augmentation data, and A/B testing
  • Experience hosting, provisioning, and maintaining database servers in a cloud environment, preferably GCP

Equal Opportunity Statement:
BigRio is an equal opportunity employer. We prohibit discrimination and harassment of any kind based on race, religion, national origin, sex, sexual orientation, gender identity, age, pregnancy, status as a qualified individual with a disability, protected veteran status, or any other characteristic protected by federal, state, or local laws. BigRio makes hiring decisions based solely on qualifications, merit, and business needs at the time. All qualified applicants will receive equal consideration for employment.
Company:
Bigr.io
Posted:
August 16 on The Resumator
