
Let’s create WOW together

At InfoBeans, we believe in making other people's lives better, through our work and everyday interactions.

Current Openings

ServiceNow SPM Architect
Pune – Viman Nagar, Indore CITP | ServiceNow | 12+ years

Role:

ServiceNow SPM Architect

Location:

Pune – Viman Nagar, Indore CITP

Experience:

12+ years

Key Skills:

CAD, CIS, ServiceNow CSA, SPM, ITOM

What will your role look like


Lead the design and architecture of ServiceNow Strategic Portfolio Management (SPM) solutions.

Drive end-to-end implementation of SPM modules including Demand Management, Project Portfolio Management (PPM), Resource Management, and Financial Planning.

Define and implement CSDM (Common Service Data Model) aligned architecture across the ServiceNow platform.

Oversee CMDB strategy, governance, and data integrity to support SPM and enterprise-wide initiatives.

Establish and manage foundation data models, ensuring scalability and alignment with business processes.

Collaborate with business stakeholders to translate strategic objectives into scalable ServiceNow solutions.

Provide technical leadership, architecture governance, and best practices for SPM implementations.

Mentor development teams and ensure high-quality solution delivery.

Integrate SPM with other ServiceNow modules (ITSM, ITOM, HRSD) and third-party systems.

Drive continuous improvement, platform optimization, and innovation.
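Integrations like the ones above typically go through ServiceNow's REST Table API. The Python sketch below shows how a third-party system might build a query for SPM demand records; the instance name, credentials, and query are placeholders, and `dmn_demand` (the SPM Demand table) is assumed to be the target table.

```python
import base64
import urllib.parse
import urllib.request


def build_table_api_request(instance, table, user, password, query="", limit=10):
    """Build (but do not send) a GET request against the ServiceNow Table API.

    The endpoint shape follows ServiceNow's documented REST Table API:
    https://<instance>.service-now.com/api/now/table/<table>.
    The instance name and credentials here are placeholders.
    """
    url = (
        f"https://{instance}.service-now.com/api/now/table/{table}"
        f"?sysparm_query={urllib.parse.quote(query)}&sysparm_limit={limit}"
    )
    # Basic auth shown for brevity; production integrations normally use OAuth.
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(url, headers={
        "Accept": "application/json",
        "Authorization": f"Basic {token}",
    })


# Example: a request for open demands from the SPM Demand table (dmn_demand).
req = build_table_api_request("acme-dev", "dmn_demand", "api.user", "secret",
                              query="state=open")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` returns JSON with the records under a top-level `result` key.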


Why you will love this role

Opportunity to lead enterprise-level digital transformation initiatives.

Work on cutting-edge ServiceNow SPM implementations with global stakeholders.

High visibility role influencing business strategy and decision-making.

Collaborative environment with strong leadership and growth opportunities.

Exposure to large-scale CMDB, CSDM, and data architecture challenges.


We would like you to bring along

12+ years of overall IT experience with strong ServiceNow expertise.

Proven experience as an SPM Architect / Solution Architect.


Deep expertise in:

ServiceNow SPM (PPM, Demand, Resource Management)

CSDM framework implementation

CMDB architecture & governance

Foundation data modeling & data strategy

Strong understanding of ServiceNow platform capabilities, integrations, and best practices.

Experience in enterprise architecture design and governance.

Excellent stakeholder management and communication skills.

Ability to lead large teams and drive solution ownership.


Good-to-have skills

ServiceNow certifications (CSA, CAD, CIS-SPM preferred)

Experience with ITSM / ITOM integrations.

Knowledge of Agile / SAFe methodologies.

Exposure to reporting & analytics (Performance Analytics).

Experience in data governance and enterprise data strategies.


Data Architect
Bangalore | IT Enabled Services | 6+ years

Role:

Data Architect

Location:

Bangalore

Experience:

6+ years

Key Skills:

OCI, Data Analytics Platform, ELT/ETL, Medallion Architecture, Dimensional Modeling, AWS & Azure, Data Integration, Data Flow, Spark, Data Lakehouse, Databricks

Job Description:

1. As a Senior Data Engineering Data Architect, you will lead the design and implementation of enterprise-scale data platforms.
2. You will work on projects that require deep expertise in Oracle's Modern Data Platform, cloud-native data engineering, and Medallion architecture principles.
3. You will interact with clients to understand and gather requirements.
4. You will architect scalable, secure, and high-performance data solutions that support advanced analytics, AI/ML, and real-time decision-making across the organization.

Desired Profile

1. Design and implement modern data platforms using Oracle AIDP (AI Data Platform), Oracle Cloud Infrastructure (OCI), Oracle Autonomous Database, and Oracle Data Integration tools.

2. Architect and operationalize Medallion architecture (Bronze, Silver, Gold layers) for structured and unstructured data pipelines.

3. Lead the development of data lakehouse solutions integrating batch and streaming data using tools like Apache Kafka, Spark, and Databricks.

4. Define and enforce data governance, metadata management, and data quality frameworks.

5. Proficiency in PySpark development is required, including hands-on expertise in Spark SQL and the DataFrame API, performance tuning of PySpark jobs, and experience with distributed computing and job orchestration.

6. Collaborate with business stakeholders to translate analytical needs into scalable data models and pipelines.

7. Mentor and guide data engineers, analysts, and architects across multiple teams.

8. Evaluate emerging technologies and lead proof-of-concept (PoC) initiatives.

9. Ensure compliance with security, privacy, and regulatory standards in data architecture.

10. Understanding of AI/ML concepts with hands-on experience in designing, developing, and deploying AI/ML models. Should have practical experience in applying machine learning techniques to real-world business problems.

11. Experience working with cloud-based ML platforms (e.g., OCI Data Science) is a plus. Exposure to large language models (LLMs), generative AI, and prompt engineering is desirable.

12. Any industry-standard certifications will be a plus.

13. Good knowledge of Oracle Database and development experience in database applications.

14. Creativity, Personal Drive, Influencing and Negotiating, Problem Solving.

15. Building Effective Relationships, Customer Focus, Effective Communication, Coaching.

16. Ready to travel as and when required by the project.
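The Medallion (Bronze/Silver/Gold) layering referenced above can be illustrated with a small, self-contained Python sketch. Plain lists of dicts stand in for Spark DataFrames, and the order data is made up; in a real pipeline each step would be a PySpark transformation writing to a lakehouse table.

```python
from collections import defaultdict

# Bronze layer: raw ingested records, kept as-is (duplicates, bad rows and all).
bronze = [
    {"order_id": "1", "region": "EU", "amount": "100.0"},
    {"order_id": "1", "region": "EU", "amount": "100.0"},         # duplicate
    {"order_id": "2", "region": "US", "amount": "not-a-number"},  # bad row
    {"order_id": "3", "region": "US", "amount": "250.5"},
]


def to_silver(rows):
    """Silver layer: cleanse and conform - dedupe on the key, cast types, drop bad rows."""
    seen, out = set(), []
    for r in rows:
        if r["order_id"] in seen:
            continue
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine these, not silently drop them
        seen.add(r["order_id"])
        out.append({"order_id": r["order_id"], "region": r["region"], "amount": amount})
    return out


def to_gold(rows):
    """Gold layer: business-level aggregate, e.g. revenue per region."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["region"]] += r["amount"]
    return dict(totals)


silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'EU': 100.0, 'US': 250.5}
```

Each layer only reads from the one below it, so the raw history in Bronze stays replayable while Gold serves analytics directly.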


Experience:

1. 6+ years of experience in data lakehouse architecture and design with Medallion Architecture, dimensional modeling, and ELT/ETL strategies; project experience in data preparation for pipelines, integration with analytics platforms, data lakehouse design, and metadata management, preferably on Oracle AIDP (AI Data Platform), Autonomous Database, Oracle Cloud Infrastructure (OCI), OCI Data Integration, OCI Data Flow, and Oracle GoldenGate.

2. 4+ years of experience working on PySpark development projects.

3. Extensive project experience with Databricks and Spark, preferably across OCI, AWS, Azure, Snowflake, and Hadoop, including at least two end-to-end lifecycle implementations.

4. Applied knowledge of data cataloguing, lineage tracking, RBAC, encryption, and compliance.

5. Good experience with programming tools such as SQL and Python; good-to-have skills: Java/Scala, Apache Kafka, Apache Flume, Apache Airflow.

Sr. Data Engineer
Bangalore | IT Enabled Services | 5+ years

Role:

Sr. Data Engineer

Location:

Bangalore

Experience:

5+ years

Key Skills:

Data Lakehouse, Kafka, Oracle Cloud Infrastructure (OCI), Medallion Architecture, PySpark, Python, SQL, Airflow, AWS, REST APIs

What will your role look like: 

1. As a Senior Data Engineer, you will work on the design and implementation of enterprise-scale data platforms.

2. You will work on projects to build scalable data pipelines and platforms using Oracle Cloud Infrastructure (OCI), implementing Medallion Architecture for modern data lakehouse solutions.

3. You will interact with clients to understand and gather requirements.

4. You will be responsible for developing robust data ingestion, transformation, and curation pipelines that power enterprise analytics and AI/ML workloads.


Why you will love this role:

1. Develop and maintain data pipelines across bronze, silver, and gold layers using Medallion Architecture principles.

2. Build scalable and secure data lakehouse solutions on Oracle Cloud Infrastructure (OCI).

3. Implement ETL/ELT workflows using Oracle Data Integration tools (ODI, Data Transforms, GoldenGate).

4. Optimize data processing using Spark, Python, and SQL for batch and streaming data.

5. Collaborate with data architects, analysts, and business stakeholders to deliver high-quality datasets.

6. Ensure data quality, lineage, and governance across all layers of the data platform.

7. Monitor and troubleshoot data pipeline performance and reliability.


We would like you to bring along: 

1. 5+ years of experience designing and implementing scalable and secure data lakehouse solutions with Medallion Architecture: Bronze (raw), Silver (cleansed), and Gold (curated) data layers.

2. Domain experience with Oracle AI Data Platform, Oracle Autonomous Database, Oracle Cloud Infrastructure (OCI), OCI Data Integration (OCI DI), OCI Data Flow, Oracle GoldenGate, and Oracle Data Transforms.

3. 5+ years of project experience with Spark (PySpark), SQL, Python, Airflow, Kafka, and REST APIs.

4. 2 to 4 years of experience with dimensional modeling, data profiling, data validation, schema evolution, and log extraction.

5. Applied knowledge of governance with RBAC, encryption, audit logging, and metadata management.

6. Extensive experience working on PySpark development projects and designing data lakehouse systems with heterogeneous source systems and real-time streaming.
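The data-validation and schema-evolution work listed above boils down to checks like the following Python sketch. The schema, column names, and rows are illustrative; a real pipeline would run equivalent checks over Spark DataFrames and quarantine failures rather than just flag them.

```python
def validate(rows, schema):
    """Check each row against an expected schema {column: type}.

    Returns (valid_rows, errors). Schema-evolution friendly: unknown extra
    columns are allowed, while missing or mistyped required columns are flagged.
    """
    valid, errors = [], []
    for i, row in enumerate(rows):
        problems = [f"missing {col}" for col in schema if col not in row]
        problems += [f"{col} is not {t.__name__}"
                     for col, t in schema.items()
                     if col in row and not isinstance(row[col], t)]
        if problems:
            errors.append((i, problems))
        else:
            valid.append(row)
    return valid, errors


schema = {"event_id": str, "ts": int, "value": float}
rows = [
    {"event_id": "a1", "ts": 1700000000, "value": 3.5},
    {"event_id": "a2", "ts": "oops", "value": 1.0},                     # wrong type
    {"event_id": "a3", "ts": 1700000001, "value": 2.0, "extra": True},  # new column: OK
]
ok, bad = validate(rows, schema)
print(len(ok), bad)  # 2 [(1, ['ts is not int'])]
```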

Our values

Our values are simple and the foundation of our culture stands on four pillars – Excellence, Ownership, Compassion, and Openness. Be it managing team expectations or customer experience; every process is led by our cultural pillars that are at the core of each aspect of the business.


Community is key


We believe in steady contributions to the environment and society that we live in. As a global technology leader, InfoBeans is committed to increasing digital literacy and creating a sustainable and self-reliant community.


Let's explore how we can create WOW for you!

Contact Us