SAP Knowledge Base Article - Public

3707272 - Understanding batch sizing in import jobs from Datasphere to SAP Analytics Cloud

Symptom

When performing data imports from SAP Datasphere to SAP Analytics Cloud, the following is observed:

  • Poor import performance: The data transfer takes significantly longer than expected.
  • No batch size option: Unlike other OData sources, the option to manually define the Batch Size is unavailable in the SAC query configuration.

Environment

SAP Analytics Cloud Enterprise 2026

Cause

The batch size is not a static value that can be defined in SAC. Instead, the import uses server-side pagination: Datasphere determines the batch size dynamically to optimize resource usage.

The calculation is defined by two primary constraints:

  1. Record Limit: A hard cap restricts the batch size to a maximum of 50,000 records in all cases.
  2. Payload Size Limit: The total response size for a single page is capped at 50 MB.

Based on these constraints, the logic that determines the number of records per page considers the maximum potential size, in bytes, of a single row. This size is derived from the metadata of all columns selected in the request, not from the actual data stored in those columns. In simplified form, the formula is:

Batch size = 50 MB (payload limit) / Max potential row size (bytes)

If a model contains columns with unnecessarily large definitions (e.g., String(5000)), the "max potential row size" increases. This forces the batch size to decrease to stay under the 50 MB payload limit.
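As a rough illustration only (the exact internal calculation is not published), the logic described above can be sketched as follows. The 4-bytes-per-character estimate and the column definitions are assumptions for the example:

```python
# Illustrative sketch of server-side batch sizing; not SAP's actual code.
# Assumes a 50 MB payload cap, a 50,000-record hard limit, and a
# worst-case row size derived from column metadata (declared lengths).

PAYLOAD_LIMIT_BYTES = 50 * 1024 * 1024   # 50 MB per page
RECORD_LIMIT = 50_000                    # hard cap on records per page

def max_row_size_bytes(columns):
    """Worst-case row size from declared column metadata,
    not from the actual stored data."""
    return sum(col["max_bytes"] for col in columns)

def batch_size(columns):
    by_payload = PAYLOAD_LIMIT_BYTES // max_row_size_bytes(columns)
    return min(by_payload, RECORD_LIMIT)

# A model with an oversized String(5000) column (assume ~4 bytes/char):
wide = [{"name": "ID", "max_bytes": 40},
        {"name": "COMMENT", "max_bytes": 5000 * 4}]
print(batch_size(wide))    # payload-limited: far fewer than 50,000 rows

# The same column tightened to String(10):
narrow = [{"name": "ID", "max_bytes": 40},
          {"name": "COMMENT", "max_bytes": 10 * 4}]
print(batch_size(narrow))  # now capped by the 50,000-record hard limit
```

Note how a single oversized string column dominates the worst-case row size and drives the batch size down, even if the column never actually holds long values.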

To confirm the exact batch size used during an active import, inspect the SQL queries executed in the HANA database. Open the HANA Database Explorer, look for the SQL statements sent from Datasphere to the underlying HANA database, and check for the pagination syntax in the query:

ASC LIMIT <Number> OFFSET ?

The value following LIMIT is the dynamically calculated batch size based on the model configuration at the moment the import was triggered.
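For illustration, the paging pattern behind these statements can be sketched as below. The table name, ordering column, and batch size are hypothetical; Datasphere generates the real statements internally:

```python
# Hypothetical sketch of the LIMIT/OFFSET paging pattern behind the import.
# Each page requests the next <batch> rows until all rows are covered.

def paged_statements(table, batch, total_rows):
    """Return the sequence of paged SELECTs a server-side import
    would issue for a table of total_rows rows."""
    stmts = []
    offset = 0
    while offset < total_rows:
        stmts.append(
            f"SELECT * FROM {table} ORDER BY ID ASC LIMIT {batch} OFFSET {offset}"
        )
        offset += batch
    return stmts

# Example: 10,000 rows fetched in pages of 2,616 records.
for stmt in paged_statements("SALES_DATA", 2616, 10_000):
    print(stmt)
```

In a real import the OFFSET appears as a bind parameter (`OFFSET ?`); only the LIMIT value reveals the calculated batch size.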

Note: The formula is an example to illustrate how the batch size is calculated. The exact internal calculation may vary in format.

Resolution

To increase the batch size, the maximum potential row size must be reduced. To achieve this, do the following:

  • Define column types and lengths properly:
    • Ensure that the length of string columns matches the actual data requirements. For example, if a field only ever contains 10 characters, define it as String(10) rather than String(5000).
  • Control the metadata:
    • Review the underlying Datasphere model and reduce the allocated size of columns to the minimum necessary.
  • Select only necessary columns:
    • Only select the columns strictly necessary for the SAC model. Since the calculation considers all columns selected in the request, reducing the number of columns in the SAC import job can decrease the "Max Potential Row Size", which increases the number of records per batch.
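As a self-contained sketch of the combined effect of these measures (assuming a 50 MB payload cap and 4 bytes per declared character; the real internal calculation may differ), tightening column lengths and dropping unused columns both shrink the worst-case row size and therefore raise the records-per-batch count:

```python
# Illustration of how trimming column definitions and column selection
# affects records per batch. Column layouts below are assumptions.

PAYLOAD_LIMIT = 50 * 1024 * 1024   # 50 MB per page
RECORD_LIMIT = 50_000              # hard record cap

def records_per_batch(declared_char_lengths, bytes_per_char=4):
    row_size = sum(n * bytes_per_char for n in declared_char_lengths)
    return min(PAYLOAD_LIMIT // row_size, RECORD_LIMIT)

# Before: 20 columns, five of them oversized String(5000) definitions.
before = [5000] * 5 + [100] * 15
# After: lengths trimmed to String(50) and 10 unused columns deselected.
after = [50] * 5 + [100] * 5

print(records_per_batch(before))   # small batches, many round trips
print(records_per_batch(after))    # much larger batches per page
```

Fewer, tighter columns mean fewer pages per import, which is where the throughput gain comes from.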


Keywords

SAP Analytics Cloud, SAC, SAP Datasphere, DSP, OData, Import Data, Batch Size, Performance, Latency, Throughput, Pagination, Server-side pagination, 50MB, limit, records, Column Definition, Metadata, String Length, HANA Database Explorer, SQL Trace, LIMIT, OFFSET, Data Integration, System Integration, Model Optimization, KBA, LOD-ANA-AQU-ODATA (Acquiring Data into SAC using an ODATA connection), LOD-ANA-AQU (Import Data Connections (Acquiring Data)), Bug Filed

Product

SAP Analytics Cloud 1.0