
Data Architect and ETL Development

Bruhat Insights
Location: Chennai | Experience: 7-10 years
Posted On: 07-May-2021 | Last Date to Apply: 09-Aug-2021 | No. of Vacancies: 1 | CTC: 8 to 14 Lacs

Company Profile:

Bruhat is an AIHR (Artificial Intelligence in HR) company that uses artificial intelligence to help hiring organizations not only manage their people requirements effectively but also obtain actionable insights that drive productivity and engagement.

Job description:

• Develop ETL mappings and interact with large data-processing pipelines in distributed data stores using cloud-based ETL tools
• Determine database structural requirements by analyzing client operations, applications, and programming, while reviewing objectives with clients and evaluating current systems
• Work with application DBAs and data modelers to construct data stores
• Define database physical structures and functional capabilities to accommodate data integration requirements, security, back-up, and recovery specifications
• Ensure data is ready for use by consuming applications, analysts, and data scientists, using frameworks and microservices to serve data
• Collaborate with data architects, modelers, and IT team members on project goals
• Ensure optimum performance techniques for data integration, coordinate deployment actions, and document actions
• Integrate new data management technologies and software engineering tools into existing structures
• Maintain overall performance of components involved in data integration through ETL by identifying and resolving production and application development problems
• Answer user questions
• Provide maintenance and support for data integration and ETL components by coding utilities and resolving problems


Educational Qualifications and Experience:
• Education: Bachelor's degree in Computer Science/Engineering
• Role-specific experience: 7+ years of hands-on experience working on ETL mappings, interacting with large data-processing pipelines in distributed data stores, and distributed file systems using cloud-based ETL tools such as IICS and Azure Data Factory/SSIS
• Extensive experience coding complex SQL queries in one or more leading RDBMSs, e.g. Oracle, Azure Synapse, MS SQL Server, Postgres


Required technologies:
• Tools: Informatica Cloud Services (IICS), Informatica PowerCenter 10.x, Azure Data Factory
• Databases: Postgres, Oracle 19c, Azure Synapse / SQL DW / SQL DB, SQL Server 2016/2014/2012
• Cloud and platforms: Microsoft Azure, Unix, Linux, Windows

Required Skills/Abilities:
• Advanced knowledge of Data Integration concepts and standard approaches
• Strong leadership and communication skills
• Ability to work independently once guidance and goals are provided

Desired Skills/Abilities (not required but a plus):
Experience in one or more of the following technologies:
• Data dictionaries
• Data warehousing
• Enterprise application integration
• Metadata registry
• Master Data Management (MDM)
• Relational databases
• NoSQL
• Semantics
• Data retention
• Structured Query Language (SQL)
• Procedural SQL
• Unified Modeling Language (UML)
• XML, including schema definitions (XSD and RELAX NG) and transformations

Additional consideration will be given to individuals who possess the following specific competencies (in no specific order): TIBCO Data Virtualization, SAP BW/HANA

Key Qualifications: Bachelors
Education: Any Bachelor's Degree
Skills: ETL, Informatica, Azure
Industry: IT-Software / Software Services
Gender Preference: Any
Job Type: Full Time
Diversity Tags: Not Applicable