SAP Knowledge Base Article - Public

3700095 - SAP Datasphere Object Store usage is higher than expected

Symptom

In SAP Datasphere, under System > Configuration > Tenant Configuration, the Used Storage for the Object Store has exceeded the allocated storage and is higher than the size shown in the data integration monitor.

Environment

SAP Datasphere

Reproducing the Issue

  1. Log in to the SAP Datasphere tenant.
  2. Navigate to System > Configuration > Tenant Configuration.
  3. In the Object Store area, confirm that the Used Storage is higher than expected and higher than the size displayed in the data integration monitor.

Cause

As documented in Create a File Space to Load Data in the Object Store, the Object Store storage is calculated in the following way:

The total amount of storage consumed by the file space includes: local tables (files) (total storage in MiB + buffer file size in MiB), logs, and backups of deleted objects (which are kept for 14 days after deletion).
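The calculation above can be pictured as a simple sum of the file space components. The following Python sketch uses purely hypothetical sizes (the variable names and values are illustrative, not taken from an actual tenant or any SAP API):

```python
# Hypothetical component sizes in MiB for one file space (illustrative only).
local_table_storage_mib = 900.0   # total storage of local tables (files)
buffer_file_size_mib = 120.0      # buffer file size
log_storage_mib = 50.0            # logs
deleted_backups_mib = 300.0       # backups of deleted objects (kept 14 days)

# The file space consumption is the sum of all components.
total_mib = (local_table_storage_mib + buffer_file_size_mib
             + log_storage_mib + deleted_backups_mib)
print(f"Total file space consumption: {total_mib} MiB")  # 1370.0 MiB
```

This is why the Used Storage value can exceed the size of the local tables alone: logs and 14-day backups of deleted objects count toward the total as well.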

As explained in the 'Object Store' section of Configure the Size of Your SAP Datasphere Tenant, two situations can lead to higher consumption costs:

You may incur higher consumption costs because data lake files keep a previous copy of any file affected by an operation for a given retention time, to allow for operations such as RESTORESNAPSHOT. These previous copies incur data lake storage costs. For example, a 10 MB table can consume more storage than 10 MB because each operation on it leaves behind a retained copy. For more information, see Restoring data in Data Lake Files and Limitations of Data Lake files.

Storage is rounded up to the next whole GB. For example, if all of the files in storage consume 1.2 GB, the storage is rounded up to the next full gigabyte, in this case 2 GB.
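The rounding rule is a simple ceiling to the next whole gigabyte. As a minimal sketch (the function name is hypothetical, not an SAP API):

```python
import math

def rounded_storage_gb(used_gb: float) -> int:
    """Round used storage up to the next whole GB, as counted for consumption."""
    return math.ceil(used_gb)

print(rounded_storage_gb(1.2))  # 1.2 GB counts as 2 GB
print(rounded_storage_gb(3.0))  # an exact 3.0 GB stays 3 GB
```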

Resolution

If the used storage is much higher than expected, it is usually because the local tables (file) hold too many historical records. When data is updated, a new version is stored in the object store. Keeping all of these versions consumes a lot of storage and can affect performance. To clean them up and free used storage in the object store, execute the vacuum cleanup on all local tables (file) as described in Deleting Local Table (File) Records.
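The effect described above can be pictured with a toy model. This Python sketch is purely illustrative of the version-retention behavior (it is not SAP code, and `update_table`/`vacuum` are hypothetical names, not Datasphere functions):

```python
# Toy model: each update to a local table (file) writes a new version,
# and old versions are retained until a vacuum-style cleanup runs.
versions_mb = []

def update_table(new_size_mb):
    # Each update appends a new version; previous versions are kept.
    versions_mb.append(new_size_mb)

def vacuum():
    # Keep only the current (latest) version; drop historical copies.
    del versions_mb[:-1]

for _ in range(5):
    update_table(10)            # five updates of a 10 MB table
print(sum(versions_mb))         # 50 MB consumed, though the table is only 10 MB
vacuum()
print(sum(versions_mb))         # 10 MB after cleanup
```

This mirrors why a small table can drive high Object Store usage until the vacuum cleanup removes the historical versions.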

Keywords

storage size discrepancy, object store, data integration monitor, storage calculation, vacuum cleanup, SAP Datasphere, system monitor, storage consumption, local tables, logs, backups, deleted objects, data lake, exceeded storage, TB, API Calls, Block-Hours, VACUUM_FILES, KBA, DS-SM, Space Management, Problem

Product

SAP Datasphere all versions