PubMatic (Nasdaq: PUBM) is an independent technology company maximizing customer value by delivering digital advertising’s supply chain of the future. PubMatic’s sell-side platform empowers the world’s leading digital content creators across the open internet to control access to their inventory and increase monetization by enabling marketers to drive return on investment and reach addressable audiences across ad formats and devices. Since 2006, our infrastructure-driven approach has allowed for the efficient processing and utilization of data in real time. By delivering scalable and flexible programmatic innovation, we improve outcomes for our customers while championing a vibrant and transparent digital advertising supply chain.
Skills: Automation Testing, Big Data Testing, ETL Testing, Backend Testing | Industry: IT-Software / Software Services | Gender Preference: Any | Job Type: Full Time |
Diversity Tags: Not Applicable | Key Qualifications: Bachelor's | Education: B.E. | Location: Pune |
Responsibilities:
Test big data ingestion and aggregation flows using spark-shell and related queries.
Develop automation frameworks in programming languages such as Python, and automate big data workflows such as ingestion, aggregation, and ETL processing.
Debug and troubleshoot issues within the big data ecosystem.
Set up the big data platform and Hadoop ecosystem for testing.
Define the test strategy and write test plans for data platform enhancements and for new features/services built on it.
Define operating procedures, service monitors, and alerts, and work with the NOC team to implement them.
Perform system and performance testing of the data platform and applications.
Solve problems, establish plans, and provide technical consultation in the design, development, and testing of complex engineering projects.
Review product specifications, write test cases, and develop test plans for assigned areas.
Identify issues and technical interdependencies and suggest possible solutions.
Recreate complex customer- and production-reported issues to determine root cause and verify fixes.
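To give candidates a feel for the first two responsibilities, here is a minimal, framework-agnostic sketch of an aggregation-flow validation. In practice these checks would run against Spark DataFrames via spark-shell or PySpark; plain Python dicts stand in here so the example is self-contained, and all field names (publisher, day, impressions) are illustrative assumptions, not PubMatic's actual schema.

```python
# Hypothetical sketch: validating a big-data aggregation flow.
# Real checks would run on Spark DataFrames; dicts stand in here.
from collections import defaultdict

def aggregate_events(events):
    """Roll up raw events to (publisher, day) -> total impressions."""
    totals = defaultdict(int)
    for e in events:
        totals[(e["publisher"], e["day"])] += e["impressions"]
    return dict(totals)

def validate_aggregation(raw_events, aggregated):
    """Core QA check: pipeline output must equal recomputed totals."""
    expected = aggregate_events(raw_events)
    missing = set(expected) - set(aggregated)
    mismatched = {k for k in expected if aggregated.get(k) != expected[k]}
    return not missing and not mismatched

# Illustrative fixture data.
raw = [
    {"publisher": "p1", "day": "2024-10-29", "impressions": 10},
    {"publisher": "p1", "day": "2024-10-29", "impressions": 5},
    {"publisher": "p2", "day": "2024-10-29", "impressions": 7},
]
pipeline_output = {("p1", "2024-10-29"): 15, ("p2", "2024-10-29"): 7}
```

The same recompute-and-compare pattern scales to Spark by replacing the dict roll-up with a `groupBy().agg()` over the raw events.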
Requirements:
1-3 years of experience as an SDET doing meaningful automation.
Good programming skills; Python preferred.
Hands-on experience automating backend applications (e.g., databases, REST APIs, server-side services).
Knowledge of relational databases and SQL.
Good debugging skills.
Working experience in Linux/Unix environments.
Good understanding of testing methodologies.
Good to have: hands-on experience with Big Data technologies such as Hadoop and Spark.
Quick learner and a good team member with a positive attitude.
Good verbal and written communication skills.
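As a concrete illustration of the backend-automation requirement above, here is a minimal sketch of a database integrity check using only the standard-library sqlite3 module. An in-memory database and the table names (campaigns, ad_events) are illustrative assumptions standing in for a real production backend.

```python
# Hypothetical sketch: automating a backend database check.
# An in-memory SQLite DB stands in for the real backend store.
import sqlite3

def row_count(conn, table):
    """Return the number of rows in a table."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def check_no_orphan_rows(conn):
    """QA check: every ad_event must reference an existing campaign."""
    orphans = conn.execute(
        "SELECT COUNT(*) FROM ad_events e "
        "LEFT JOIN campaigns c ON e.campaign_id = c.id "
        "WHERE c.id IS NULL"
    ).fetchone()[0]
    return orphans == 0

# Illustrative fixture schema and data.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE campaigns (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE ad_events (id INTEGER PRIMARY KEY, campaign_id INTEGER);
    INSERT INTO campaigns VALUES (1, 'launch');
    INSERT INTO ad_events VALUES (100, 1), (101, 1);
""")
```

Checks like this are typically wrapped in a test runner (pytest, unittest) and pointed at a staging database rather than an in-memory fixture.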
Qualifications:
Primary (Mandatory) Skills:
Good hands-on experience with Unix/Linux
Good hands-on experience writing Python code
Understanding of QA methodologies.
Secondary Skills (Good to have):
Experience in big data platform and data analytics testing is an advantage.
Knowledge of distributed systems and technologies such as Hadoop and Spark.
Posted: 2024-10-29 by Meghana Deshmukh | Last Date to Apply: 2025-01-22 | Contact: recruit@pubmatic.com |