Friday, March 28, 2025

Replacement of obsolete FM WWW_GET_MIME_OBJECT

Recently I noticed that FM WWW_GET_MIME_OBJECT was marked as obsolete.


Thus, Code Inspector reports it whenever it is used in custom ABAP code.

Function module WWW_GET_MIME_OBJECT is flagged as obsolete. Send any queries to the developer of the ==> Function module WWW_GET_MIME_OBJECT.


I was looking for a replacement and found FM WWWDATA_IMPORT. This FM even has an interface similar to that of WWW_GET_MIME_OBJECT. Below I show a call of both FMs.

 

WWW_GET_MIME_OBJECT call:

DATA: picture        TYPE TABLE OF w3mime,
      w3_queries     TYPE TABLE OF w3query,
      w3_htmls       TYPE TABLE OF w3html,
      return_code    TYPE w3param-ret_code,
      content_type   TYPE w3param-cont_type,
      content_length TYPE w3param-cont_len.

INSERT VALUE #( name  = `_OBJECT_ID`
                value = `Z_SMW0_ENTRY` ) INTO TABLE w3_queries.

CALL FUNCTION 'WWW_GET_MIME_OBJECT'
  TABLES
    query_string        = w3_queries
    html                = w3_htmls
    mime                = picture
  CHANGING
    return_code         = return_code
    content_type        = content_type
    content_length      = content_length
  EXCEPTIONS
    object_not_found    = 1
    parameter_not_found = 2
    OTHERS              = 3.

 

WWWDATA_IMPORT call:

DATA mime TYPE w3mimetabtype.

DATA(key) = VALUE wwwdatatab( relid = `MI`
                              objid = `Z_SMW0_ENTRY` ).

CALL FUNCTION 'WWWDATA_IMPORT'
  EXPORTING
    key               = key
  TABLES
    mime              = mime
  EXCEPTIONS
    wrong_object_type = 1
    import_error      = 2
    OTHERS            = 3.


Another option would be to replicate what the FMs do internally. In a nutshell, both FMs read the table WWWDATA via the ABAP IMPORT statement. Something like this:

  DATA mime TYPE w3mimetabtype.
  DATA key  TYPE wwwdatatab.

  key-objid = `Z_SMW0_ENTRY`.

  IMPORT mime FROM DATABASE wwwdata(mi) ID key-objid.
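
Once the binary content sits in the w3mime table, it often needs to be converted to an XSTRING for further processing (e.g. display or download). A minimal sketch, assuming the object was maintained via SMW0 so that its size is kept in table WWWPARAMS under parameter name 'filesize' (the object name and the parameter handling are illustrative – verify in your system):

  DATA size    TYPE i.
  DATA content TYPE xstring.

  " Read the file size maintained for the SMW0 object (assumption: kept in WWWPARAMS).
  SELECT SINGLE value FROM wwwparams
    WHERE relid = 'MI'
      AND objid = 'Z_SMW0_ENTRY'
      AND name  = 'filesize'
    INTO @DATA(size_char).
  size = size_char.

  " Convert the raw lines into a single XSTRING.
  CALL FUNCTION 'SCMS_BINARY_TO_XSTRING'
    EXPORTING
      input_length = size
    IMPORTING
      buffer       = content
    TABLES
      binary_tab   = mime.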

 

Friday, February 28, 2025

SAP Business Data Cloud (BDC)

On Feb 13th, 2025 SAP held an event (SAP Business Unleashed) where a new software-as-a-service (SaaS) solution in the area of data management called SAP Business Data Cloud (BDC) was announced. The solution is marketed as an evolution of SAP's planning and analytics solutions like SAP Datasphere, SAP Analytics Cloud, and SAP Business Warehouse. All of these are unified with Databricks' capabilities for data engineering and machine learning/AI.

What is Databricks? It is a data lakehouse (a portmanteau of "data warehouse" and "data lake") solution that unifies data warehouses and data lakes on one platform, handling data, analytics, and AI workloads.

By combining the SAP data management portfolio with Databricks, customers get:

- Unified Data & Analytics to combine SAP & non-SAP data

- Scalability & Performance with Apache Spark and serverless

- Delta Sharing for data access across platforms, clouds and regions

- Robust AI & Machine Learning Tool Set

 

With this new announcement, does it mean that SAP Datasphere is dead?

With Databricks now deeply integrated into SAP, does SAP Datasphere still have a role, or is it redundant? Business Data Cloud is rather the next iteration of Datasphere. However, it may turn out that some features will not be built in Datasphere but rather in Databricks. Thus, for some use cases Datasphere may become obsolete.

 

Release of SAP Business Data Cloud:

Version 1.0 (General Availability) – Feb 13, 2025

 

More information:

Press release

Press release 2

BDC landing page

BDC product tour

Online docu

Databricks announcement

Databricks paper - Lakehouse: A New Generation of Open Platforms that Unify Data Warehousing and Advanced Analytics

Thursday, February 6, 2025

BW transport issue: object locked although the transport is released

Recently I saw a situation where an object could not be included in a transport. The system indicated via pop-up message (TK117) that the object is locked:

Object XXX locked for request/task <SID>Kxxxxxx

Choose ‘Display object’ or ‘Cancel’.



However, the object was not locked in any transport or task. The transport mentioned in the error had been released and moved across the landscape a long time ago.


To solve this issue, I went to t-code SE03, then in the menu to Requests/Tasks -> Unlock Objects (Expert Tool), and via this I was able to unlock the object.




Notice that the 1st option – Unlock Object List – needs to be selected. The Transport Request/Task input field needs to be populated with the task number of the Transport Request in which, according to the error message, the object is locked.

 

Behind the scenes, what this t-code does is remove the lock entry from table TLOCK (Change and Transport System: Lock Table). The table basically holds all objects that are locked in TRs. In my case, for some reason, the lock entry wasn’t removed from the table when the TR was released – and that reason remains unknown to me…
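
Before unlocking anything, such a stale entry can be verified with a quick read-only check. A minimal sketch, assuming TLOCK stores the locking request in field TRKORR (verify the field list in SE11):

" List TLOCK entries that still point to already released requests
" (E070-TRSTATUS = 'R') - candidates for stale/orphaned locks.
SELECT l~trkorr, r~trstatus
  FROM tlock AS l
  INNER JOIN e070 AS r ON r~trkorr = l~trkorr
  WHERE r~trstatus = 'R'
  INTO TABLE @DATA(lt_stale_locks).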

 

Doing further research about the error, I also found SAP Note 2537981 - SE13 | TK117 | Request lock on nonexistent task/request, rep ZSLA_DELETE_ORPHANED_TLOCK_2. The Note describes how to remove the lock entry (a so-called orphaned lock) for LIMU TABT objects.

 


Sunday, January 12, 2025

What is ABAP Cloud?

In 2023, ABAP – SAP’s proprietary programming language – celebrated its 40th anniversary. Nowadays there is its latest evolution – ABAP Cloud. But what exactly is it?

ABAP Cloud is the latest evolution of the ABAP programming language. It is a modernized version of the traditional language designed for cloud-native development on SAP Business Technology Platform (BTP) as well as for hybrid setups where on-premise (e.g. SAP S/4HANA editions) and cloud SAP systems coexist (side-by-side or extensions). It is SAP's effort to bring its classic ABAP programming language and development tools into the cloud era.

It is not a new product but an evolution of the ABAP development model, enabling cloud-based solutions and extensions while adhering to modern development principles.

 

Features of the ABAP Cloud

·        Clean Core Principle: It enforces strict development guidelines that prevent modifications to standard SAP code, encouraging the use of extension points and APIs instead. This makes upgrades and maintenance easier.

·        Cloud-Optimized/Native support: It provides built-in support for cloud services, RESTful APIs, and modern web technologies while maintaining compatibility with traditional ABAP syntax. Focuses on principles of cloud computing, such as scalability, security, and high availability.

·        Restricted ABAP: Operates within a "restricted ABAP environment," which ensures that development aligns with cloud principles by enforcing modern and safe ABAP practices. It prevents the use of older, system-dependent, or unsafe techniques (see the snippet after this list). See also the ABAP Strict and ABAP language versions blog posts.

·        Development Model: It supports both on-stack (in the core system) and side-by-side (on SAP BTP) scenarios, allowing developers to create cloud-ready extensions and applications.

·        Core Elements: ABAP Cloud incorporates essential components like:

o   Core Data Services (CDS) for data modeling and embedded analytics.

o   ABAP RESTful Application Programming Model (RAP) for business logic and service development – a programming model specifically designed for building modern, cloud-native business applications.

o   SAP Fiori and OData Integration: It is optimized for building SAP Fiori applications and RESTful services using OData. This aligns with SAP's UX strategy and the shift toward API-first development.

o   Cloud-optimized ABAP language syntax and tools.

·        Tools: Developers use ABAP Development Tools (ADT) in Eclipse as the integrated development environment (IDE).

·        Extensibility: Enables building of side-by-side extensions for SAP applications, such as SAP S/4HANA Cloud, without modifying the core application. This approach ensures that updates and upgrades to core systems are seamless.
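
As an example of what the restricted language scope means in practice: direct reads of classic system fields are replaced by released APIs. A minimal sketch (the released class CL_ABAP_CONTEXT_INFO and its methods as I recall them – verify the list of released APIs in ADT):

" Classic ABAP would read sy-datum / sy-uname directly.
" In ABAP Cloud the released API is used instead:
DATA(system_date) = cl_abap_context_info=>get_system_date( ).
DATA(user_name)   = cl_abap_context_info=>get_user_technical_name( ).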

 

Use cases of the ABAP Cloud

·        Modernizing legacy ABAP Code: Transitioning older ABAP developments to a cloud-ready format while adhering to best practices and SAP's guidelines.

·        Custom apps: Building entirely new applications to meet specific business needs, leveraging SAP BTP services.

·        Extensions: Extending the capabilities of SAP S/4HANA Cloud or on-premise systems without modifying the standard SAP codebase.

Steampunk vs. ABAP Cloud

It’s essential not to confuse ABAP Cloud with Steampunk (or Embedded Steampunk). ABAP Cloud was initially introduced with Steampunk, back then called “SAP Cloud Platform ABAP Environment” (approx. in 2018). Steampunk is the internal name for the ABAP environment in SAP BTP, which offers a subset of classic ABAP functionality optimized for cloud development. Since then, ABAP Cloud has also become available in all SAP S/4HANA editions, meaning it is present on the SAP S/4HANA stack.

The umbrella term 'ABAP Cloud' doesn’t just mean development in the SAP BTP ABAP Environment. Rather, it is about cloud-ready development standards which also apply to on-premise or on-stack extensions to keep the core clean.

What else to add? In a nutshell, it is the new cloud-ready ABAP development model. ABAP Cloud is SAP’s next step on the journey from classic ABAP – and perhaps, eventually, to ABAP AI?

 

More information:

ABAP Cloud FAQ

ABAP Cloud what does it comprise

ABAP Cloud Technical use cases and recommended technologies


Saturday, January 4, 2025

Program RSPROCESS vs RSBATCH_EXECUTE_PROZESS

In one of my earlier posts, I described an ABAP program called RSPROCESS. However, there is also a program RSBATCH_EXECUTE_PROZESS. Both serve a similar purpose in SAP BW systems: they are used to manage certain BW processes. As described in that other blog post, RSPROCESS executes a process variant of a process chain. For the various types of process variants of a PC, see this post or refer to table RSPROCESSTYPES.

On the other hand, the RSBATCH_EXECUTE_PROZESS program is used for managing BW processes at the level of individual batch processes. It is often involved in specific data warehousing activities such as data loads, DSO (DataStore Object) activations, and parallel processing. RSBATCH_EXECUTE_PROZESS is typically scheduled as a background job to handle these tasks efficiently.

Selection screen of the RSBATCH_EXECUTE_PROZESS.



Automation for SAP BW configuration tasks

ABAP Task Manager (aka Task Manager for Technical Configuration) is a part of the ABAP stack that serves to execute automation task lists for various configuration tasks in an automated way. It is available in the SAP ABAP Platform/ABAP NetWeaver backend via t-codes STC01 (ABAP task manager for lifecycle management automation) and STC02 (Task list run monitor). Examples of task lists that exist in the Task Manager include automation of the following activities: system copy, post system copy (PCA – Post Copy Automation), initial system setup, system check, Fiori setup, embedded search, etc. Of course, in SAP BW systems there are also task lists available for SAP BW task automation.



ABAP Task Manager is only the runtime for the execution of the automation task lists. For most automation tasks, corresponding automation content is also needed, which is offered by SAP as well. The content contains the tasks themselves. Below are a few examples of SAP BW related task lists:

 

SAP_BW_HOUSEKEEPING                         Activities associated with regular BW system maintenance           

SAP_BASIS_BW_OIA_CONFIG                    SAP_BASIS_BW_OIA_CONFIG       

SAP_BW_AFTER_MIGRATION                     Activities following the successful migration of a BW system 

SAP_BW_AFTER_UPGRADE                       Activities following the successful upgrade of a BW system           

SAP_BW_BASIS_COPY_INITIAL_CONFIG      Initial Copy for BW and BW Source Sys – Cleanup and Configuration

SAP_BW_BASIS_COPY_REFRESH_CONFIG    Sys Refresh of BW/BW Source Systems Export/Cleanup/Import/Conf

SAP_BW_BEFORE_MIGRATION                    Activities prior to the migration of a BW system       

SAP_BW_BEFORE_UPGRADE                       Activities prior to the upgrade of a BW system         

SAP_BW_COPY_INITIAL_PREPARE               Preparation for Initial Copy of BW System   

SAP_BW_SETUP_INITIAL_CONFIG               BW Initial Setup Task List 

SAP_BW_SETUP_INITIAL_S4HANA              BW Initial Setup Task List for S4/HANA

SAP_BW4_TRANSFER_CHECK_CLOUD_RMT  Collect and Check BW objs whether they are compatible to BW Bridge

SAP_BW4_TRANSFER_CHECK_CLOUD_SHL   Collect and Check BW objs whether they are compatible to BW Bridge

SAP_BW4_TRANSFER_CHECK_INPLACE        Collect and Check BW objects whether they are compatible to BW/4

SAP_BW4_TRANSFER_CHECK_REMOTE        Collect and Check BW objects whether they are compatible to BW/4

SAP_BW4_TRANSFER_CHECK_SHELL           Collect and Check BW objects whether they are compatible to BW/4

SAP_BW4_TRANSFER_CLOUD_REMOTE        Activities to be performed in original sys of remote BW4Cloud-Transfer

SAP_BW4_TRANSFER_CLOUD_SHELL           Activities to be performed in original sys of shell BW4Cloud Transfer

SAP_BW4_TRANSFER_INPLACE                   Transfer BW objects to be compatible to BW/4

SAP_BW4_TRANSFER_READY4CONV            Transfer IOs & Open Hub Destinations to be compatible to BW/4

SAP_BW4_TRANSFER_REMOTE_PREPARE     Activities to be performed in original system of remote BW4-Transfer

SAP_BW4_TRANSFER_REMOTE_SHELL         Activities to be performed in original system of shell BW4-Transfer

SAP_BW4_TRANSFER_SEM                         Tasks to Transfer SEM-BW objects to be compatible to BW/4

SAP_BW4_TRANSFER_SEM_SHELL              Transfer SEM-BCS objects & BW objects w/o data into remote sys

 

While a task list is being executed, the BW system triggers a job following the naming convention BW_TASK*, e.g. job BW_TASK_20241105082934000005000.

 

More information:

Automated Initial Setup of ABAP Systems Based on SAP NetWeaver

ABAP Post-Copy Automation for SAP BW Configuration Guide

1829728 - BW housekeeping task list

3349077 - [BW Central KBA] Systemcopy / Refresh


Tuesday, December 31, 2024

SAP Change Data Capture (CDC)

Change Data Capture (CDC) allows identifying and capturing changes made to data in SAP application tables within the SAP system. Those data changes can then be replicated in real time to other systems or processes. It's a way to keep data synchronized across different environments, enabling real-time analytics, data warehousing, and other data-driven initiatives.

There are different techniques for implementing CDC in SAP environments:

Log-Based CDC: This approach reads the transaction logs of the SAP database to identify changes. It's generally considered the most efficient method with minimal impact on the source system.

Trigger-Based CDC: This method uses database triggers to capture changes as they occur. Triggers are database objects that automatically execute a predefined action in response to a specific event (e.g., an insert, update, or delete operation).

Table-Based CDC: This technique involves comparing snapshots of tables at different points in time to identify changes. It's less efficient than log-based or trigger-based methods but can be used in situations where those options are not available.

SAP has several tools and technologies that leverage CDC, including:

SAP Data Services: A data integration and data quality platform that includes CDC capabilities.

SAP Landscape Transformation (SLT) Replication Server: A tool for real-time data replication from SAP systems to SAP HANA.

There are many 3rd party tools that uses SAP CDC as well.


Now, when it comes to extracting data via CDS (Core Data Services), aka ABAP CDS views, trigger-based CDC is used as well. Specific annotations for the CDS view need to be specified in ABAP Development Tools (ADT). The annotations allow the CDS view to use a trigger-based CDC recording mechanism to load data and record changes to the tables that belong to the view, as sketched below.
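
For illustration, a CDS view entity enabled for extraction with trigger-based CDC delta could look roughly like this (view and field names are made up; the exact annotation set depends on the release, so verify it in ADT):

@AccessControl.authorizationCheck: #NOT_REQUIRED
@EndUserText.label: 'Extraction-enabled view with CDC delta'
@Analytics.dataExtraction.enabled: true
@Analytics.dataExtraction.delta.changeDataCapture.automatic: true
define view entity ZI_SalesDocForExtraction
  as select from vbak
{
  key vbeln as SalesDocument,
      erdat as CreationDate,
      ernam as CreatedBy
}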

PS: Do not confuse SAP CDC (Change Data Capture) with other SAP products like SAP Customer Data Cloud.

More information:

Online documentation


Sunday, December 29, 2024

Exposing SAP data with a Note 3255746 in mind

In a nutshell, with an update of Note 3255746, SAP stops customers and 3rd party applications from using the RFC modules of the ODP Data Replication API to access and extract SAP data from ABAP sources, as these APIs are intended only for SAP’s internal use. Effectively, this is SAP’s ban on the RFC-based usage.

This means it is no longer permitted to use the ODP RFC modules to access ABAP CDS views and other ODP sources directly. Instead, the ODP-based OData API provided by SAP for data extraction should be used. This API is stable and recommended for all customer and third-party applications. Furthermore, SAP advises customers “to use SAP Datasphere for realizing data replication scenarios to move data from various SAP sources (such as SAP S/4HANA, SAP BW, SAP ECC sources etc.) into third-party applications & tools.”

The restriction was introduced on February 2nd, 2024, when SAP updated the mentioned Note. The Note originated sometime around 2022, when it stated that the ODP RFC method was “unsupported”. But the Feb 2024 update of the Note changed unsupported -> unpermitted.

Impact on 3rd party tools / Target platforms

SNP/Datavard Glue - no impact, as the tool uses a technology that is independent of any third party (SAP in this case). More info.

Theobald Xtract – impacted. Theobald claims it will enhance its tool to use the OData API (in addition to the currently utilized RFC modules) sometime around 2024 Q4. More info.

Init Software – the Init ODP Source Connector is impacted. Use the Init OData v2 Source / Sink Connector instead. More info.

Databricks – impacted, as their lakehouse ingests data from many sources. More info.

Azure Data Factory (ADF) / Azure Synapse Analytics - these MS Azure data ingestion platforms are impacted, as the Azure Data Factory SAP CDC connector uses the ODP framework. Use the OData API based connector available in Azure instead. More info.

Qlik Replicate – impacted. More info.

Google Cloud Data Fusion – as there are many GCP connectors for SAP (e.g. BigQuery, SAP BW Open Hub Batch Source, SAP OData, SAP ODP, SAP SLT Replication, SAP Table Batch Source), some of them are impacted. The specific case needs to be reviewed and, if impacted, a new integration needs to be built based on the OData API. More info.

AWS AppFlow – there are several connectors available in AWS AppFlow. If the “Amazon AppFlow SAP OData connector” is used, there is no impact, as it leverages the OData API.

HVR/Fivetran - Fivetran's SAP NetWeaver connector uses RFC calls, thus it is impacted. Most likely a new Fivetran ODP connector based on the ODP OData API will be released soon. More info.

Snowflake – depends on which tool is used to replicate the data to Snowflake.

Notice that there are many other tools used to expose SAP data that are not listed in here.

Also, there is the big question of whether this action by SAP is legally binding for its customers. Well, it is actually a big topic. What drives it is the signed SAP Software Use Rights. There is a standard version of it (but note that the document is subject to evolution and depends on the SAP version, etc.) that clearly says that asynchronous indirect access to SAP data is not licensed without SAP Digital Access, BW or OpenHub. Thus, one can believe that even if SAP puts auditing/tracking mechanisms in place and proves that a particular customer is violating Note 3255746, they can’t do much about it. But again, a disclaimer – this is not legal advice at all! Always consult with your SAP account manager and also reach out for professional help, e.g. from companies providing SAP license consulting.

 

More information:

3255746 - Unpermitted usage of ODP Data Replication APIs

ODP-Based Data Extraction via OData

Guidance for Partners on certifying their data integration offerings with SAP Solutions

Thursday, October 31, 2024

How to clear cache in SAP Analysis for Microsoft Office

When SAP Analysis for Microsoft Office (AfO) is reinstalled or upgraded, several errors can pop up, making it impossible to reuse/refresh the existing reports. Basically, AfO crashes or freezes with errors like the following.

 

"An exception occurred in one of the data sources. SAP BI Add-in has disconnected all data sources. (ID-111007)"

"Nested exception. See inner exception below for more details."

 

The root cause of errors like these is that when AfO is upgraded/uninstalled, the process does not clear the AfO cache. To correct it, the cache needs to be cleared manually.

The first cache folder to be cleared is:

"c:\Users\<USER_ID>\AppData\Roaming\SAP AG\SAP BusinessObjects Advanced Analysis\cache\"

In this folder, the cache files are located under a naming convention like this:

<SID>.cache

 

The other folder to look for is the COF (Common Office Framework) directory under %APPDATA%, which is accessible via the link:

"%APPDATA%\SAP\Cof"

It points to the folder:

"c:\Users\<USER_ID>\AppData\Roaming\SAP AG\SAP BusinessObjects Advanced Analysis\cache\"

 

Once the cache is cleared, start AfO from the Windows Start menu (All Apps -> SAP Business Intelligence -> Analysis for Microsoft Office) and it should be possible to refresh AfO reports again.

 

More information:

2979452 - An exception occurred in one of the data sources. SAP BI Add-in has disconnected all data sources [1e04-3ff5-15]

AfO wiki

Friday, October 25, 2024

Scheduling a Process Chain in an alternate time zone

There is an option available in BW systems to run a process chain in a different time zone. It is available in the start variant of the PC. There is a checkbox “Use Alternative Time Zone”; once it is checked, a new field shows up.

The feature can be useful in case the BW admin is not aware of what time zone the BW system runs in; the alternative time zone can be used instead.

Once an alternative time zone is specified for a specific PC’s start variant, it is saved in table RSPCTRIGGER, field TMZONE.
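
A quick way to see which start variants use an alternative time zone is to read that table directly. A minimal sketch (I assume the variant key field is called VARIANTE – verify the field list of RSPCTRIGGER in SE11):

" List start variants with an alternative time zone maintained.
SELECT variante, tmzone
  FROM rspctrigger
  WHERE tmzone <> @space
  INTO TABLE @DATA(lt_alt_tz).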

Same functionality is leveraged in SAP standard background jobs.



Monday, September 30, 2024

Different product lines of SAP BW

In some cases there is confusion about the versions of SAP BW introduced over the years (the 1st version appeared around 1998). Let me briefly sort this out. This blog post is not comprehensive but tries to set the naming convention of the major BW releases straight.

 

1. SAP Business Warehouse Classic (classic BW) aka SAP NetWeaver based Business Warehouse (component SAP_BW), runs on any DB, see details:

SAP Business Warehouse 3.5 part of SAP NetWeaver 04

SAP Business Warehouse 7.0 part of SAP NetWeaver 2004s (NW’04s) aka NetWeaver 7.0

SAP Business Warehouse 7.3

SAP Business Warehouse 7.4

SAP Business Warehouse 7.5

These versions of BW are sometimes referred to as SAP NetWeaver BW (all versions), aka BW 7.x.

 

2. SAP Business Warehouse powered by SAP HANA aka BW on HANA (component SAP_BW), runs on SAP HANA DB only, see details here

 

3. SAP BW/4HANA (component DW4CORE), see details here or here. BW/4HANA was based on BW 7.5 but redeveloped; many components were removed, and it is not based on the NetWeaver stack anymore.

 

If the term classic BW is used, what is meant by that is BW based on the SAP NetWeaver stack – all versions starting with 3.5 up to and including BW on HANA. However, the difference between 7.x and BW on HANA is that 7.x supports any database while BW on HANA runs on the HANA DB only.

 

More information:

Short history of SAP BW

SAP BW/4HANA (B4H) versions


Monday, August 12, 2024

SAP S/4HANA Cloud Public vs Private Edition?

SAP S/4HANA Cloud is an enterprise resource planning (ERP) suite offered by SAP, and it comes in two primary deployment options: Public Edition and Private Edition. Each offers different features, levels of customization, and deployment flexibility to cater to various business needs. In general, below is a breakdown of the differences between the two:


SAP S/4HANA Cloud Public Edition is a better deal for organizations that want a standardized, effective, and quickly deployable ERP solution with minimal customization needs.

On the other hand, the Private Edition is better suited for organizations that require a highly customized ERP environment, need control over their system updates, and are willing to invest in a more flexible and powerful deployment model.

3-Tier Model to get to ABAP Cloud

Customers who want to migrate their SAP ERP systems to the cloud need to embrace the cloud from the ABAP perspective too. This shift is needed because on-premise SAP ERP systems offer classic ABAP extensibility options (user/customer exits, BAdIs, enhancement points, modifications, appends, structures, menu exits, etc.). All of these were used to tailor SAP systems to meet specific business requirements. But since the introduction of the cloud, the classic ABAP extensibility options are no longer supported in the cloud-based SAP ERP systems.

Apparently, the majority of SAP customers won’t start their move to the cloud with a new greenfield implementation of an ERP system such as S/4HANA Cloud. Therefore, SAP had to come up with something that would enable the cloud transition for existing customers running their ERP systems on premise: the 3-tier Extensibility Model. Its purpose is to enable the transition from classic ABAP to ABAP Cloud. It is also intended to manage the coexistence of these different extensibility models.

Remember the much-used term "clean core"? It means an up-to-date, transparent, unmodified SAP system. All these adjectives describe a system that is cloud compliant. The reason why it is important is that in the cloud all customers use the same base code line and changes are applied to all customers simultaneously. Therefore, there is no way to allow each individual customer to implement enhancements in the same way they could in their on-premise systems.

 

Tier 1 – Cloud development: the default choice for all new extensions and new custom applications following the SAP S/4HANA Cloud extensibility model. The goal is to get to this tier from the lower tiers.

 

Tier 2 – Cloud API enablement / API layer: if there are any objects (BAPIs, classes, function modules, Core Data Services) that are not yet released by SAP but are required in tier 1, a custom wrapper is created for them (see the sketch below). In this way, missing local public APIs or extension points are mitigated. The custom wrappers are built and released for cloud development. Once an SAP-released public local API exists, the custom one can be marked as obsolete and removed. ABAP Test Cockpit (ATC) can be leveraged here to enforce the ABAP Cloud guidelines; via ATC exemptions, violations of the ABAP Cloud rules can be managed.
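
To make tier 2 more tangible, here is a minimal sketch of such a wrapper. The class, the method and the wrapped artifact are purely illustrative; in a real scenario the wrapper would be released for cloud development and governed by ATC:

CLASS zcl_wrap_material_text DEFINITION PUBLIC FINAL CREATE PUBLIC.
  PUBLIC SECTION.
    "! Wrapper around a not-yet-released artifact (here: a direct read of MAKT)
    "! so that tier 1 (ABAP Cloud) code only consumes this released class.
    METHODS get_description
      IMPORTING iv_material     TYPE matnr
      RETURNING VALUE(rv_descr) TYPE maktx.
ENDCLASS.

CLASS zcl_wrap_material_text IMPLEMENTATION.
  METHOD get_description.
    " Classic access that is not allowed directly in ABAP Cloud code;
    " it is isolated here until SAP releases a public local API.
    SELECT SINGLE maktx FROM makt
      WHERE matnr = @iv_material
        AND spras = @sy-langu
      INTO @rv_descr.
  ENDMETHOD.
ENDCLASS.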

 

Tier 3 – Legacy development / classic ABAP development: classic extensibility based on classic ABAP custom code that is not supported in the ABAP Cloud development model, e.g. BAPIs, user exits, modifications, SAP GUI, file access, reports writing to the GUI, etc. The goal is to avoid developments in this tier and follow the ABAP Cloud development model. However, if the customer is at this stage, the classic objects are to be modernized and moved to tier 1. Those need to be refurbished one by one; there is no tool for that.

 

Now, when it comes to the actual (re)development of objects in the particular tiers, a concept of software components is used. By creating its own component, an object is separated from the others (e.g. non-clean-core components – remember clean core). This is because the component imposes stricter ABAP Cloud rules on its objects, thus the separation is needed.

For all the details on how to work with objects within a specific tier, follow the official SAP guidelines below.

 

More information:

Clean Core

ABAP Cloud API Enablement Guidelines for SAP S/4HANA Cloud, private edition, and SAP S/4HANA - overview

ABAP Extensibility Guide - overview

ABAP Cloud - How to mitigate missing released SAP APIs in SAP S/4HANA Cloud, private edition and SAP S/4HANA – The new ABAP Cloud API enablement guide

SAP S/4HANA Extensibility: All You Need to Know