Job Role: TIBCO Architect/SME (Subject Matter Expert)
Experience Required: 8+ years
Job Location: Riyadh, Saudi Arabia
We are seeking an experienced Data Engineer – SME / TIBCO Architect to lead the design, development, and optimization of enterprise-level data integration and messaging solutions. The ideal candidate will have deep expertise in TIBCO technologies (Business Works, EMS, BWCE, BE, API management, etc.), strong data engineering skills, and a proven ability to architect scalable, secure, high-performance integration ecosystems. This role combines technical leadership, solution architecture, and hands-on engineering to support mission-critical business processes.

Key Responsibilities

Architecture & Design
- Architect end-to-end TIBCO-based integration and data engineering solutions.
- Design scalable, fault-tolerant, event-driven, and service-oriented architectures.
- Develop integration patterns using TIBCO Business Works, EMS, FTL, BWCE, and APIX / Mashery.
- Evaluate existing systems and define modernization roadmaps (cloud, microservices, containerization).

Data Engineering
- Design and build robust data pipelines for structured, semi-structured, and unstructured data.
- Implement ETL/ELT workflows, streaming pipelines, and real-time data processing.
- Optimize data flows for performance, reliability, and cost efficiency.
- Integrate data solutions with cloud platforms (AWS, Azure, GCP) and modern data stacks.

Technical Leadership
- Serve as the subject matter expert for TIBCO technologies and enterprise integration strategy.
- Provide technical guidance, best practices, and mentorship to engineering teams.
- Collaborate with business stakeholders, architects, and developers to translate requirements into technical solutions.
- Lead design reviews, performance tuning, troubleshooting, and production support.

Development & Implementation
- Build and configure TIBCO components including BW/BWCE, EMS, RV, BE, ActiveSpaces, and TIBCO Adapters.
- Develop APIs, orchestrations, and microservices using TIBCO and complementary technologies.
- Implement CI/CD pipelines, infrastructure as code, and automated deployment frameworks.
- Ensure compliance with enterprise security, data governance, and DevOps standards.

Support & Optimization
- Monitor integration ecosystem performance and identify opportunities for improvement.
- Lead incident analysis, root-cause investigations, and system remediation.
- Optimize messaging, data flows, and API performance for high-throughput applications.

Required Skills & Qualifications
- Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field.
- 10+ years of experience in enterprise integration or data engineering roles.
- 7+ years of hands-on experience with TIBCO technologies, including:
  - Business Works / BWCE
  - EMS / FTL
  - TIBCO BE
  - TIBCO Adapters
  - Mashery / APIX
  - ActiveMatrix / SFG (optional)
- Strong expertise in data engineering:
  - ETL/ELT, streaming pipelines, and data modeling
  - SQL/NoSQL databases (Oracle, Postgres, Mongo, Cassandra, etc.)
  - Big Data tools (Hadoop, Spark, Kafka, Flink)
- Strong proficiency in Java, Python, or Go, and scripting languages (Shell, Groovy).
- Experience with Docker, Kubernetes, CI/CD pipelines, and Git-based workflows.
- Cloud experience (AWS / Azure / GCP) for data integration, messaging, and APIs.
- Deep understanding of integration patterns, SOA, microservices, and event-driven architecture.
- Excellent communication, documentation, and stakeholder management skills.

Preferred Qualifications (Not mandatory)
- TIBCO certifications (Business Works, EMS, FTL, API Management).
- Experience migrating from legacy TIBCO stacks to cloud-native environments.
- Exposure to Snowflake, Databricks, or cloud data warehousing platforms.
- Knowledge of MDM, Data Governance, and metadata management tools.
- Experience with Agile/Scrum methodology.