What is OSLC - OSLC Overview
Most readers of this blog post probably work in systems engineering or software engineering consulting and want to understand the basic concepts and principles of OSLC. This series of blog posts covers various aspects of OSLC as I understand them from my own learning.
2. RELEVANT BACKGROUND
2.1 Conflicts between disconnected information and cross-lifecycle collaboration
We believe OSLC was created to address systems integration problems, so we need to start at the source: systems integration itself.
Enterprise digitization is not achieved overnight; it is a gradual, iterative process. Driven by the needs of the enterprise's R&D business, the digitization of each lifecycle stage is put on the agenda step by step. However, the product lifecycle spans multiple engineering domains (typically requirements management, configuration management, quality management, change management, and so on), so achieving "one-click" automation across all of them is a long and complicated undertaking. Enterprises therefore tend to start digitization in one or a few engineering fields and then improve gradually through iteration.
Digitization cannot be implemented without the support of software tools. In the process, enterprises generally purchase and deploy a series of commercial or open-source tools, weighing factors such as business fit and cost. Tool development is usually business-driven, aimed at solving problems in a specific domain. As a result, tools from different domains are naturally isolated from one another, and so are the information spaces they manage.
Although most tool vendors build platform products to enable collaboration among multi-domain tools on a common platform, such platforms are often limited in several respects. The first is cost. Traditional tool vendors tend to be strong in one or a few specific areas, such as Bentley in 3D modelling or IBM in configuration management and requirements management. A platform strategy is not limited to what the vendor is good at; it depends on the vendor's strategic planning and on market drivers. Vendors must therefore invest substantial manpower in platform development, which brings its own costs and risks. The second is the product release cycle. Each engineering field has its own fiercely competitive market, and how quickly a tool reaches that market is one of the determining factors for success.
As shown in the figure below, many software tools have the following features:
In summary, each domain tool plays a major role in its own domain, but the data and workflows between domain tools are separated from each other: there is no data sharing, and workflows are inconsistent across the project lifecycle. This leads to the formation of internal information silos.
As an enterprise matures, its managers inevitably focus on improving R&D management capability. Information silos in separate areas are a barrier to that further development.
2.2 Traditional Methods of System Integration and Their Drawbacks
There are various ways to address information silos, but the most practical and commonly used is point-to-point ("P2P") integration: using the APIs provided by existing tools to enable data exchange between them through extension development. P2P integration offers direct and flexible integration capabilities.
Although this approach is limited by the API capabilities of the tools involved, it allows convenient integration between two tools based on business requirements, as long as the APIs permit it, and it avoids concerns such as generality and reusability. However, P2P integration has notable disadvantages:
Data Replication instead of Linking: In typical cases, P2P integration involves data replication, which can lead to data redundancy and inconsistency.
High Development Costs: Developers need to understand the integration mechanisms of both tools, including platforms, languages, and APIs. This requires significant upfront research and learning costs.
High Maintenance Costs: If a tool replacement or version upgrade occurs, changes or instabilities in the API will inevitably impact existing integration work, greatly increasing maintenance costs.
Poor Scalability: Each new tool multiplies the number of integration points (n tools may require up to n(n-1)/2 point-to-point connections), leading to heavy costs.
Low Re-usability: P2P integration relies on tool-specific APIs and is tightly coupled to the tools, making efficient reuse difficult.
3. EMERGENCE OF OSLC
3.1 What is OSLC?
OSLC, short for "Open Services for Lifecycle Collaboration," is a set of technical specifications proposed by OASIS. These specifications primarily address the integration challenges of lifecycle tools. The OSLC specifications consist of core specifications and domain specifications. The core specifications describe the core integration technologies and general concepts, while domain specifications focus on specific engineering domains.
Domains refer to familiar areas in traditional software engineering, such as requirements management, configuration management, quality management, asset management, and change management. The development of OSLC specifications is carried out by working groups within the OSLC community. Depending on the scope of the specification being developed, working groups are divided into core working groups and domain working groups. As the names imply, core working groups focus on developing the core specifications, while domain working groups develop the OSLC specifications for the different engineering domains.
3.2 Analysis of OSLC Technical Specifications
3.2.1 Core Idea of OSLC - Linked Data
The core idea of OSLC is "Linked Data," which follows four rules (source: http://www.w3.org/DesignIssues/LinkedData.html):
1. Use URIs as names to identify things.
2. Use HTTP URIs to enable users to look up those names.
3. When users look up a URI, provide useful information in standard formats (RDF*, SPARQL).
4. Include links to other URIs to help users discover more information.
The rules of "Linked Data" can be summarised as follows:
Things are identified using HTTP URIs, and users can retrieve useful information about these names through requests in standard forms. The inclusion of links to other URIs allows users to discover more information.
OSLC is based on this fundamental idea, transforming artifacts in the software development lifecycle into resources, such as a requirement, a test case, or a development plan, all identified by HTTP URIs. Users can access these resources through HTTP requests. OSLC mandates that resource representations must support RDF and can also support other resource formats such as JSON/HTML.
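As a minimal sketch of what such a resource looks like on the wire (the resource URI and property values below are invented for illustration), an OSLC requirement might be represented in RDF/XML, and a client can pull standard properties such as dcterms:title out of it:

```python
import xml.etree.ElementTree as ET

# Hypothetical RDF/XML representation of an OSLC Requirement resource.
# The URI and property values are invented for illustration.
RDF_XML = """<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:dcterms="http://purl.org/dc/terms/"
         xmlns:oslc_rm="http://open-services.net/ns/rm#">
  <oslc_rm:Requirement rdf:about="http://example.com/requirements/42">
    <dcterms:title>The engine shall start within 2 seconds</dcterms:title>
    <dcterms:identifier>REQ-42</dcterms:identifier>
  </oslc_rm:Requirement>
</rdf:RDF>"""

RDF = "{http://www.w3.org/1999/02/22-rdf-syntax-ns#}"
DCTERMS = "{http://purl.org/dc/terms/}"

root = ET.fromstring(RDF_XML)
req = root[0]                    # the single Requirement element
uri = req.get(RDF + "about")     # the HTTP URI that identifies the resource
title = req.find(DCTERMS + "title").text

print(uri)    # http://example.com/requirements/42
print(title)  # The engine shall start within 2 seconds
```

The essential point is that the requirement is identified by an HTTP URI, so any other tool that holds that URI can dereference it to obtain the RDF representation.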
The OSLC core specifications define simple HTTP and RDF usage patterns and minimal resource types to ensure tool integration. The domain specifications in OSLC build upon the core technologies defined in the OSLC core specifications and define resource representations for specific domains.
Integration Technologies in OSLC:
OSLC exists to solve the problem of lifecycle tool integration. How does it achieve this at a specification level?
OSLC provides two main integration technologies: HTTP CRUD-based integration ("linking data via HTTP") and HTML UI-based integration ("linking data via HTML user interface").
1. HTTP CRUD-based Integration:
OSLC realizes CRUD (Create, Read, Update, Delete) operations on resources through standard resource representations and the HTTP protocol.
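As a minimal sketch (the resource URI and ETag value are invented), a common OSLC update pattern is: GET the resource with an RDF Accept header, then PUT the modified representation back with If-Match so the server can detect concurrent changes. Here the requests are only constructed, not sent:

```python
import urllib.request

RESOURCE = "http://example.com/bugs/123"   # hypothetical resource URI

# Read: ask the server for an RDF/XML representation of the resource.
get_req = urllib.request.Request(
    RESOURCE,
    headers={"Accept": "application/rdf+xml"},
)

# Update: send the modified representation back. The If-Match header carries
# the ETag received from the GET response, so the server can reject the update
# if someone else changed the resource in the meantime.
put_req = urllib.request.Request(
    RESOURCE,
    data=b"...modified RDF/XML body...",
    method="PUT",
    headers={
        "Content-Type": "application/rdf+xml",
        "If-Match": '"etag-received-from-get"',   # invented ETag value
    },
)

print(get_req.get_method(), get_req.get_header("Accept"))
print(put_req.get_method(), put_req.get_header("If-match"))
```

Create and delete follow the same shape with POST (to a creation factory URI) and DELETE (to the resource URI).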
2. HTML UI-based Integration:
In addition to supporting basic data operations, OSLC introduces a novel integration approach known as "seamless UI integration." The UI integration methods defined by OSLC include "UI Preview" and "Delegated UI." "UI Preview" is mainly used for data preview purposes, while "Delegated UI" is used for artifact selection and creation.
3.3 In-depth Analysis of OSLC Core Specifications
3.3.1 Service Discovery
Service discovery is an essential and sometimes challenging feature of OSLC. In the OSLC technical specifications, service interfaces are not published as a fixed API; instead, the client discovers them layer by layer. From the client's perspective, it only needs to know a single entry point, from which it can gradually discover the services it requires by following the OSLC protocol.
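The discovery chain in OSLC Core starts from a ServiceProviderCatalog, which links to ServiceProvider resources, which in turn advertise services. As a sketch (the catalog and provider URIs are invented), a client might parse the catalog it fetched from its configured entry point like this:

```python
import xml.etree.ElementTree as ET

RDF = "{http://www.w3.org/1999/02/22-rdf-syntax-ns#}"
OSLC = "{http://open-services.net/ns/core#}"

# Hypothetical ServiceProviderCatalog document, as a client might receive it
# from the single entry-point URI it was configured with.
CATALOG = """<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:oslc="http://open-services.net/ns/core#"
         xmlns:dcterms="http://purl.org/dc/terms/">
  <oslc:ServiceProviderCatalog rdf:about="http://example.com/oslc/catalog">
    <oslc:serviceProvider>
      <oslc:ServiceProvider rdf:about="http://example.com/oslc/providers/projectA">
        <dcterms:title>Project A</dcterms:title>
      </oslc:ServiceProvider>
    </oslc:serviceProvider>
  </oslc:ServiceProviderCatalog>
</rdf:RDF>"""

root = ET.fromstring(CATALOG)
# Collect the URI of every ServiceProvider the catalog links to.
providers = [sp.get(RDF + "about") for sp in root.iter(OSLC + "ServiceProvider")]
print(providers)  # ['http://example.com/oslc/providers/projectA']
```

The client then GETs each ServiceProvider URI in the same way to discover the services it offers, such as query capabilities, creation factories, and delegated dialogs.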
3.3.3 Resource Query
OSLC specifications define query mechanisms for complex queries, enabling clients to flexibly query remote resources. For example:
http://example.com/bugs?oslc.where=cm:severity="high" and dcterms:created >"2010-04-01"
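In practice the oslc.where clause must be percent-encoded before it is sent. A small sketch (the base URI is hypothetical, and oslc.select is included only to show how returned properties can be limited) using the standard library:

```python
from urllib.parse import urlencode

base = "http://example.com/bugs"   # hypothetical query capability URI

# The oslc.where clause from the example above, plus an optional oslc.select
# to limit which properties the server returns; urlencode percent-encodes both.
params = {
    "oslc.where": 'cm:severity="high" and dcterms:created>"2010-04-01"',
    "oslc.select": "dcterms:title,cm:severity",
}
query_url = base + "?" + urlencode(params)
print(query_url)
```

The client simply GETs the resulting URL and receives an RDF result set describing the matching resources.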
3.3.4 Delegated UI Dialogue
Delegated UI Dialogue is typically used in integration scenarios where users want to select and link resources from Tool A in Tool B.
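When the user finishes selecting in the embedded dialog, the dialog notifies the embedding page with a message prefixed "oslc-response:" followed by JSON listing the selected resources. As a sketch, the parsing below is written in Python purely for illustration (in a browser this handling would live in the page's message listener), and the resource URI and label are invented:

```python
import json

# Message a delegated selection dialog posts back to the embedding page
# (via window.postMessage in the browser). The "oslc-response:" prefix comes
# from the OSLC Core convention; the URI and label here are invented.
MESSAGE = (
    'oslc-response:{"oslc:results": ['
    '{"oslc:label": "REQ-42: Engine start time", '
    '"rdf:resource": "http://example.com/requirements/42"}]}'
)

PREFIX = "oslc-response:"

def parse_dialog_response(message):
    """Extract (label, uri) pairs from a delegated-dialog response message."""
    if not message.startswith(PREFIX):
        raise ValueError("not an OSLC dialog response")
    body = json.loads(message[len(PREFIX):])
    return [(r["oslc:label"], r["rdf:resource"]) for r in body["oslc:results"]]

selections = parse_dialog_response(MESSAGE)
print(selections)
```

Tool B can then store the returned URI as a link, without ever replicating Tool A's data.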
3.3.5 UI Preview
UI Preview is primarily used to address the issue of cross-tool data preview. For example, let's consider a scenario where there is a link between test cases in Test Management Tool A and requirements in Requirement Management Tool B:
In this scenario, users expect to view information about the associated requirement directly within the test management tool, without having to open the requirements management system separately. This is exactly the "data preview" need that UI Preview in OSLC is designed to satisfy. The integration scenario based on UI Preview is shown in the diagram above.