Adobe AEP — Solving DCVS ingestion errors
What are ingestion errors?
In Adobe Experience Platform (AEP), ingestion errors can occur while importing data, whether batched or streamed; the two ingestion methods do not need separate treatment here.
Any dataset can be affected; here we shall consider only the Web Event Dataset, as this is where problems are concentrated in day-to-day operations. The current client has 62 datasets.
The dataset “prod — Web Event Data” is associated with our schema, constructed from the Adobe Analytics ExperienceEvent Template and Full Extension field groups, plus several other field groups added in a cumulative, ad-hoc fashion.
Ingestion errors should not be confused with dataflow run errors. Dataflow errors are also ingestion errors, but not of the type relevant here, and they are documented.
This article covers DCVS errors, which are completely undocumented.
Which Adobe products are affected?
RT-CDP & CJA are principally affected, since CJA is an application built on the platform; looking at the overview diagram, AJO ought to be affected too.
Adobe Analytics, which is much more forgiving, is unaffected.
In AA, minor data errors or inconsistencies can be tolerated; in RT-CDP, entire batches can be rejected!
What is the consequence of failure to ingest?
Missing data will result in:
- Incomplete customer profiles
- Impaired segmentation and targeting
- Reduced personalisation effectiveness
- Delayed or inaccurate analytics (within CJA or RT-CDP Audience sizes)
- Activation issues
- Compliance and governance risks (if consent information is included)
- Reduced ROI
- Impaired ML & AI Capabilities
- Operational Difficulties
In layman's terms:
- The customer may be treated as abandoned when they have purchased.
- Guests may be nudged to re-engage to complete an action which they have already completed
- Nudging fans to complete post-event surveys which they have already completed
- Retargeting those via display advertising for actions already completed
- Sending emails, e.g. abandoned-cart emails, for actions already complete
- Treating the customer as opt-in when they are opted out
… all because the data points were not ingested into the platform!
Ingestion error codes
There are many codes; however, the vast bulk of error codes seen are of type DCVS. There is no mention of these codes on Experience League, only some chatter in the communities (one of which is mine!). Google/Perplexity reveal no hits!
DCVS-1101-400 (null found; rarer)
This error appears mainly around null values.
Examples:
The message cannot be validated: [#/_clientnamespace/product/0/businessGroup: expected type: JSONArray, found: String].
The message cannot be validated: [#/_clientnamespace/product/0/businessGroup/0: expected type: String, found: Null].
The message cannot be validated: [#/_clientnamespace/product/0/businessGroup/0: expected type: String, found: Null].
The message cannot be validated: [#/_clientnamespace/product/2/businessGroup/0: expected type: String, found: Null].
The message cannot be validated: [#/_clientnamespace/product/2/businessGroup/0: expected type: String, found: Null].
The message cannot be validated: [#/_clientnamespace/product/0/businessGroup: expected type: JSONArray, found: String].
The message cannot be validated: [#/_clientnamespace/product/1/businessGroup/0: expected type: String, found: Null].
The message cannot be validated: [#/_clientnamespace/product/1/businessGroup/0: expected type: String, found: Null].
The message cannot be validated: [#/_clientnamespace/product/1/businessGroup/0: expected type: String, found: Null].
We will refer to these as ‘Type 1’
DCVS-1103-400 (pattern mismatch)
Very rarely seen.
The message cannot be validated due to the pattern mismatch error:
#/commerce/order/currencyCode: string [????] does not match pattern ^[A-Z]{3}$.
We will refer to this as ‘Type 3’
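A Type 3 mismatch like this can be caught before the hit is ever sent. Below is a minimal pre-validation sketch in Python (a hypothetical helper, not part of any Adobe SDK) applying the same ^[A-Z]{3}$ rule the schema enforces:

import re

# The same pattern the XDM schema enforces on commerce.order.currencyCode
CURRENCY_PATTERN = re.compile(r"^[A-Z]{3}$")

def is_valid_currency(code):
    """Return True only for three-letter upper-case codes, e.g. 'USD'."""
    return isinstance(code, str) and bool(CURRENCY_PATTERN.match(code))

assert is_valid_currency("USD")
assert not is_valid_currency("BR$")  # seen in the wild; should be 'BRL'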
DCVS-1104-400 (data type mismatch)
This error indicates that there’s a mismatch between the data type of a field in your incoming data and the data type defined in the XDM schema.
DCVS-1104-400 The message cannot be validated due to the data type error: #/commerce/order/priceTotal: expected type: Number, found: String.
DCVS-1104-400 The message cannot be validated due to the data type error: #/productListItems: expected type: JSONArray, found: String.
DCVS-1104-400 The message cannot be validated due to the data type error: #/_clientnamespace/product/0/businessGroup/0: expected type: String, found: Null.
DCVS-1104-400 The message cannot be validated due to the data type error: #/_experience/analytics/event101to200/event184/value: expected type: Number, found: String.
DCVS-1104-400 The message cannot be validated due to the data type error: #/_clientnamespace/product/0/businessGroup: expected type: JSONArray, found: String.
DCVS-1104-400 The message cannot be validated due to the data type error: #/_clientnamespace/product/0/businessGroup/0: expected type: String, found: Null.
DCVS-1104-400 The message cannot be validated due to the data type error: #/_experience/analytics/customDimensions/eVars/eVar57: expected type: String, found: Integer.
DCVS-1104-400 The message cannot be validated due to the data type error: #/_clientnamespace/product/0/businessGroup/0: expected type: String, found: Null.
DCVS-1104-400 The message cannot be validated due to the data type error: #/productListItems/0/quantity: expected type: Number, found: String.
DCVS-1104-400 The message cannot be validated due to the data type error: #/productListItems/0/quantity: expected type: Number, found: String.
DCVS-1104-400 The message cannot be validated due to the data type error: #/_clientnamespace/product/0/businessGroup/0: expected type: String, found: Null.
DCVS-1104-400 The message cannot be validated due to the data type error: #/_experience/analytics/event101to200/event182/value: expected type: Number, found: Null.
DCVS-1104-400 The message cannot be validated due to the data type error: #/_experience/analytics/customDimensions/eVars/eVar57: expected type: String, found: Integer.
DCVS-1104-400 The message cannot be validated due to the data type error: #/productListItems/0/quantity: expected type: Number, found: String.
DCVS-1104-400 The message cannot be validated due to the data type error: #/_experience/analytics/customDimensions/eVars/eVar89: expected type: String, found: JSONObject.
DCVS-1104-400 The message cannot be validated due to the data type error: #/productListItems/0/quantity: expected type: Number, found: String.
DCVS-1104-400 The message cannot be validated due to the data type error: #/marketing/trackingCode: expected type: String, found: JSONArray.
DCVS-1104-400 The message cannot be validated due to the data type error: #/_clientnamespace/product/0/businessGroup/0: expected type: String, found: Null.
DCVS-1104-400 The message cannot be validated due to the data type error: #/productListItems/5/quantity: expected type: Number, found: String.
DCVS-1104-400 The message cannot be validated due to the data type error: #/productListItems/1/quantity: expected type: Number, found: String.
DCVS-1104-400 The message cannot be validated due to the data type error: #/productListItems/1/quantity: expected type: Number, found: String.
DCVS-1104-400 The message cannot be validated due to the data type error: #/productListItems/5/quantity: expected type: Number, found: String.
We will call this ‘Type 4’
DCVS-1106-400 (required field missing)
This error suggests that a required field in the XDM schema is missing from your incoming data.
The message cannot be validated because a required property is missing: #/_experience/analytics/event101to200/event110: required key [value] not found.
The message cannot be validated because a required property is missing: #/productListItems/0: required key [SKU] not found.
The message cannot be validated because a required property is missing: #/_experience/analytics/event1to100/event80: required key [value] not found.
The message cannot be validated because a required property is missing: #/_experience/analytics/event1to100/event25: required key [value] not found.
The message cannot be validated because a required property is missing: #/_experience/analytics/event1to100/event26: required key [value] not found.
The message cannot be validated because a required property is missing: #/_experience/analytics/event1to100/event19: required key [value] not found.
The message cannot be validated because a required property is missing: #/_experience/analytics/event1to100/event18: required key [value] not found.
The message cannot be validated because a required property is missing: #/_experience/analytics/event101to200/event114: required key [value] not found.
The message cannot be validated because a required property is missing: #/_experience/analytics/event1to100/event19: required key [value] not found.
The message cannot be validated because a required property is missing: #/_experience/analytics/event1to100/event25: required key [value] not found.
The message cannot be validated because a required property is missing: #/_experience/analytics/event1to100/event18: required key [value] not found.
We will refer to these as ‘Type 6’
Datatype Basics — a refresher
Numbers (integer or decimal)
{ "productPrice": 199.99 }
Strings
{ "productName": "Smartphone" }
dateTime
{ "purchaseDate": "2024-09-19T15:30:00Z" }
Boolean
{ "isAvailable": true }
OK, that's the simple stuff; now it gets more involved:
Array of strings
{ "ProductMarkets": ["US", "EU", "Asia"] }
Object
{ "product": { "SKU": "12345", "description": "High-quality smartphone" } }
Datatypes II — now it is getting hairy
This is where we experienced difficulty.
An array of strings, as found in our JSON sample file (see next section):
{
"businessGroup": ["Sample value"]
}
Objects
{
"product": {
"SKU": "Sample value",
"businessGroup": ["Sample value"],
"description": "Sample value"
}
}
JSON Nested Objects
{
"webInteraction": {
"URL": "Sample value",
"linkClicks": {
"id": "Sample value",
"value": 29138.61
}
}
}
Array of objects
An array of objects is found in fields like product, where each object contains multiple properties:
{
"product": [
{
"SKU": "Sample value",
"business": [
{
"businessID": "Sample value",
"businessName": "Sample value"
}
],
"businessGroup": ["Sample value"],
"description": "Sample value"
}
]
}
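The JSONArray-versus-JSONObject distinction is exactly what trips the DCVS-1104 errors later in this article, so a defensive shape check is worth sketching. A minimal Python example, assuming the field names from the sample above:

def check_product_shape(payload):
    """Verify that 'product' is an array of objects, as the XDM schema expects."""
    product = payload.get("product")
    if not isinstance(product, list):
        raise TypeError(f"product: expected JSONArray, found {type(product).__name__}")
    for i, item in enumerate(product):
        if not isinstance(item, dict):
            raise TypeError(f"product/{i}: expected JSONObject, found {type(item).__name__}")
        if not isinstance(item.get("businessGroup"), list):
            raise TypeError(f"product/{i}/businessGroup: expected JSONArray")

# Passes: product is a one-element array of objects
check_product_shape({"product": [{"SKU": "12345", "businessGroup": ["Sample value"]}]})

# Raises TypeError: product is a bare object, the exact mistake seen in the failed batches
# check_product_shape({"product": {"SKU": "12345"}})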
Using the sample file
This will allow for experimentation and testing of files and formats, seeing which errors are generated.
Processing times on an empty sandbox are <1 minute
"_experience": {
"analytics": {
"contextData": {
"key": "Sample value"
},
"customDimensions": {
"eVars": {
"eVar1": "Sample value",
"eVar10": "Sample value",
"eVar100": "Sample value",
"eVar101": "Sample value",
"eVar102": "Sample value",
// ... (other eVars omitted for brevity)
"eVar159": "Sample value"
}
}
}
}
eVars expect strings, and this cannot be altered:
window.adobeDataLayer = window.adobeDataLayer || [];
window.adobeDataLayer.push({
"event": "eVar1Update",
"_experience": {
"analytics": {
"customDimensions": {
"eVars": {
"eVar1": "Example eVar1 value" // this is CORRECT
}
}
}
}
});
window.adobeDataLayer = window.adobeDataLayer || [];
window.adobeDataLayer.push({
"event": "eVar1Update",
"_experience": {
"analytics": {
"customDimensions": {
"eVars": {
"eVar1": 1234 // This will FAIL
}
}
}
}
});
events expect numbers, and are likewise locked:
{
"_experience": {
"analytics": {
"event101to200": {
"event101": {
"id": "Sample value",
"value": 813.52
},
"event102": {
"id": "Sample value",
"value": 25240.48
},
"event103": {
"id": "Sample value",
"value": 6329.62
},
"event104": {
"id": "Sample value",
"value": 5819.11
},
"event105": {
"id": "Sample value",
"value": 5811.46
},
// ... (other events omitted for brevity)
"event135": {
"id": "Sample value",
"value": 28647.86
}
}
}
}
}
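Because these types are locked, it is safer to coerce values at the edge, before the hit leaves the data layer. A minimal, hypothetical Python sketch of that coercion (the field names are illustrative):

def coerce_analytics_types(evars, events):
    """Force eVars to strings and event values to numbers, dropping nulls."""
    clean_evars = {k: str(v) for k, v in evars.items() if v is not None}
    clean_events = {}
    for name, ev in events.items():
        value = ev.get("value")
        if value is None:
            continue  # a null value would trigger DCVS-1104; a missing one, DCVS-1106
        clean_events[name] = {**ev, "value": float(value)}
    return clean_evars, clean_events

evars, events = coerce_analytics_types(
    {"eVar57": 1234},                       # Integer -> "1234"
    {"event80": {"id": "1", "value": "5"}}  # String  -> 5.0
)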
Testing Methodology
Workflow:
- Create a new schema
- Create a new dataset from that schema
- After schema alteration, download the sample file
- Deliberately break the sample file and upload:
Note the odd UI: enable Error diagnostics first (1), then move upward to select the file (2); you will need error diagnostics enabled.
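The upload can also be scripted rather than fought through the UI. The sketch below is my reading of the documented three-step Batch Ingestion API flow (create batch, upload file, complete); the token, dataset ID and file name are placeholders, and enableErrorDiagnostics appears to be the API counterpart of the UI toggle:

import requests

IMPORT = "https://platform.adobe.io/data/foundation/import/batches"
DATASET_ID = "<DATASET_ID>"
HEADERS = {
    "Authorization": "Bearer <ACCESS_TOKEN>",
    "x-api-key": "<API_KEY>",
    "x-gw-ims-org-id": "<IMS_ORG>@AdobeOrg",
    "x-sandbox-name": "<SANDBOX_NAME>",
}

# 1. Create the batch, asking for error diagnostics up front
create = requests.post(
    IMPORT,
    headers={**HEADERS, "Content-Type": "application/json"},
    json={"datasetId": DATASET_ID,
          "inputFormat": {"format": "json"},
          "enableErrorDiagnostics": True},
)
create.raise_for_status()
batch_id = create.json()["id"]

# 2. Upload the (deliberately broken) sample file into the batch
with open("sample.json", "rb") as f:
    requests.put(
        f"{IMPORT}/{batch_id}/datasets/{DATASET_ID}/files/sample.json",
        headers={**HEADERS, "Content-Type": "application/octet-stream"},
        data=f.read(),
    ).raise_for_status()

# 3. Mark the batch complete so validation and promotion run
requests.post(f"{IMPORT}/{batch_id}",
              params={"action": "COMPLETE"}, headers=HEADERS).raise_for_status()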
A successful file ingest will look like this:
If you don’t enable error diagnostics, you will get:
If successful, the API will show you the batch file location:
https://platform.adobe.io/data/foundation/catalog/batches/:BATCH_ID
{
"01J8J0H2YJDH794W7RE7R5T51T": {
"status": "success",
"tags": {
"acp_stagePath": [
"acp_foundation_push/stage/01J8J0H2YJDH794W7RE7R5T51T"
],
"acp_sloPolicyName": [
"live10Mb"
],
"aep/siphon/partitions/paths": [],
"acp_finalized_time": [
"1727181410455"
],
"acp_workflow": [
"ValveWorkflow"
],
"numberOfDSFs": [
"0"
],
"acp_requestType": [
"user"
],
"acp_enableErrorDiagnostics": [
"true"
],
"acp_latencyTargetInMillis": [
"300000"
],
"acp_dataSetViewId": [
"66f2b241a985222aedc14c00"
],
"acp_type": [
"ingest"
],
"siphon/valve/stage/ingest": [
"{\"id\":\"2d7555521a224148bfd1db94df326f25\",\"status\":\"created\",\"createdAt\":1727181396912,\"batchId\":\"01J8J0H2YJDH794W7RE7R5T51T\",\"imsOrg\":\"988D095F54BD18520A4C98A5@AdobeOrg\",\"bulkHead\":\"live\",\"service\":\"platform.siphon.ingest\",\"properties\":{}}"
],
"siphon/valve/ingest/status": [
"{\"id\":\"2d7555521a224148bfd1db94df326f25\",\"status\":\"finished\",\"createdAt\":1727181408642,\"batchId\":\"01J8J0H2YJDH794W7RE7R5T51T\",\"imsOrg\":\"988D095F54BD18520A4C98A5@AdobeOrg\",\"bulkHead\":\"live\",\"output\":\"/acp_foundation_push/stage/01J8J0H2YJDH794W7RE7R5T51T-staged/attempt-01J8J0HH9H8WN4RBJ10D0NCS23\",\"sandbox\":{\"sandboxId\":\"84b47ce0-4d40-4137-b47c-e04d40b137ea\",\"sandboxName\":\"inspire-ucp\"},\"properties\":{\"tableCommit\":{\"masterSnapshotId\":\"1\"}}}"
],
"acp_bulkHead": [
"live"
],
"acp_producer": [
"exc_app",
"aep/siphon/bi/uploadMode::"
],
"acp_requestId": [
"qLeHBnphtczpTmRtkA8W1nbIpfsZbuyD"
],
"acp_finalized": [
"finalized"
],
"acp_buffered": [
"false"
],
"acp_latencyMaxInMillis": [
"10800000"
]
},
"relatedObjects": [
{
"type": "dataSet",
"id": "66f2b241a985222aedc14bff"
}
],
"id": "01J8J0H2YJDH794W7RE7R5T51T",
"externalId": "01J8J0H2YJDH794W7RE7R5T51T",
"inputFormat": {
"format": "json",
"isMultiLineJson": false
},
"imsOrg": "988D095F54BD18520A4C98A5@AdobeOrg",
"sandboxId": "84b47ce0-4d40-4137-b47c-e04d40b137ea",
"createdUser": "40770A6762E59BEF0A495ED8@3ce86b6a62e59be9495ef9.e",
"started": 1727181395892,
"metrics": {
"failedRecordCount": 0,
"partitionCount": 0,
"outputByteSize": 8218,
"inputFileCount": 1,
"inputByteSize": 349,
"outputRecordCount": 1,
"outputFileCount": 1,
"inputRecordCount": 1
},
"completed": 1727181409894,
"errors": [],
"created": 1727181393108,
"createdClient": "acp_foundation_push",
"updatedUser": "acp_foundation_dataTracker@AdobeID",
"updated": 1727181410468,
"version": "1.0.6"
}
}
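For reference, here is a minimal Python sketch of pulling that same batch record, assuming you already have an IMS access token and the usual gateway headers; note the response is a dict keyed by batch ID, as above:

import requests

CATALOG = "https://platform.adobe.io/data/foundation/catalog/batches"
HEADERS = {
    "Authorization": "Bearer <ACCESS_TOKEN>",
    "x-api-key": "<API_KEY>",
    "x-gw-ims-org-id": "<IMS_ORG>@AdobeOrg",
    "x-sandbox-name": "<SANDBOX_NAME>",
}

batch_id = "01J8J0H2YJDH794W7RE7R5T51T"
resp = requests.get(f"{CATALOG}/{batch_id}", headers=HEADERS)
resp.raise_for_status()

batch = resp.json()[batch_id]  # the response is keyed by the batch ID
print(batch["status"], batch["metrics"]["failedRecordCount"])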
If you didn't enable diagnostics, the API will not show the NDJSON (newline-delimited JSON) path:
{
"data": [
{
"name": "66f2b241a985222aedc14c00",
"length": "0",
"_links": {
"self": {
"href": "https://platform.adobe.io:443/data/foundation/export/batches/01J8J324F5MFTHE5JX48P4QM5N/failed?path=66f2b241a985222aedc14c00" // The NDjson path is missing
}
}
}
],
"_page": {
"limit": 100,
"count": 1
}
}
Uploading the sample file with deliberately bad data will generate an error.
{
"_clientnamespace":
{
"documentation test - 23 Sept 2024 - 12:42 GMT": "Sample value", // bad
"test2": 9540,
"test3": false,
"test4":
{
"key": "Sample value"
}
},
"_id": "/uri-reference",
"eventMergeId": "Sample value",
"eventType": "advertising.completes",
"identityMap":
{
"key":
[
{
"authenticatedState": "ambiguous",
"id": "Sample value",
"primary": false
}
]
},
"producedBy": "self",
"timestamp": "2018-11-12T20:20:39+00:00"
}
The field: _clientnamespace.documentation test - 23 Sept 2024 - 12:42 GMT does not exist in the target schema. Please remove the field and try again.
Testing results & many real-world failure examples:
By looking at failed batches:
Error:
"_errors": {
"_streamingValidation": [
{
"path": "#/_dow/product",
"code": "DCVS-1104-400",
"message": "The message cannot be validated due to the data type error: #/_dow/product: expected type: JSONArray, found: JSONObject.",
"errorType": 0
}
]
}
Payload:
"_dow": {
"product": {
"pdpBannerImage": "/content/dam/images/public-images/web/pdp/pdppgheader/dow_54812013514-pdppgbanner-red-molecules-2000x160.jpg",
"description": "A general purpose rigid foam silicone surfactant surfactant for production of rigid polyurethane foam.",
"products": [
{
"businessGroup": [
"CONSUMER SOLUTIONS"
],
"business": [
{
"businessID": "DCS",
"businessName": "Consumer Solutions"
}
],
"SKU": "497615z"
}
]
}
},
Bug: the product field sent here is a JSON object, not an array. Here's why:
- It uses curly braces {} at the outermost level, which is the syntax for a JSON object.
- The structure contains key-value pairs, where each key (e.g., "_dow", "product", "description") is associated with a value. This is characteristic of an object.
In contrast, a JSON array would be enclosed in square brackets [] and consist of a list of values.
… ah ha!!!
Error:
"_errors": {
"_streamingValidation": [
{
"path": "#/_experience/analytics/event1to100/event80",
"code": "DCVS-1106-400",
"message": "The message cannot be validated because a required property is missing: #/_experience/analytics/event1to100/event80: required key [value] not found.",
"errorType": 0
}
]
}
Payload:
"analytics": {
"event1to100": {
"event80": {
"id": "1"
}
}
}
Bug: the XDM object sent doesn't have value specified. It looks like somebody filled out the XDM mapping incorrectly.
Error:
{
"path": "#/commerce/order/priceTotal",
"code": "DCVS-1104-400",
"message": "The message cannot be validated due to the data type error: #/commerce/order/priceTotal: expected type: Number, found: String.",
"errorType": 0
},
Payload:
"order": {
"priceTotal": "62251.2",
"purchaseID": "116676690",
"purchaseOrderNumber": "S100264-9",
"currencyCode": "USD"
}
Should be "priceTotal": 62251.2 (a number, not a string).
Error:
"_errors": {
"_streamingValidation": [
{
"path": "#/productListItems/0",
"code": "DCVS-1106-400",
"message": "The message cannot be validated because a required property is missing: #/productListItems/0: required key [SKU] not found.",
"errorType": 0
}
]
}
Payload:
"productListItems": [
{
"productAddMethod": "Quick Order",
"name": "PE LDPE 203M BG6025 KG"
}
],
Bug: the payload should include the SKU field.
Error: pattern mismatch (Type 3). Payload:
"order": {
"priceTotal": "62251.2",
"purchaseID": "1142376709",
"purchaseOrderNumber": "S100264-12",
"currencyCode": "BR$"
}
The regex check failed; the code should be BRL for the Brazilian Real. This is coming off the back end badly formed.
Strict adherence to schema design is necessary
The schema design in Adobe Experience Platform (AEP) is rigid due to the structured nature of XDM (Experience Data Model) schemas. Data flowing from sources like Adobe Launch to the XDM Experience Event schema must be carefully mapped and validated to prevent errors, particularly around data types.
Here's what to pay attention to:
- Data Type Mismatch: Ensure the data elements created in Adobe Launch match the data types expected in the XDM schema. For example, if an XDM field expects a string, but Launch passes an integer or object, it can cause ingestion errors or data loss.
- XDM Validation: The data being ingested must conform strictly to the defined schema. During ingestion, AEP validates the data against the XDM schema, rejecting any records that don’t meet the format.
- Mapping Data Elements to XDM: When mapping data elements to XDM fields in Adobe Launch, make sure you’re using the correct data transformations. Data elements need to be properly processed to match the format expected in the XDM schema (e.g., converting timestamps to the correct format, and handling null values).
- Custom XDM Fields: Be mindful of how custom XDM fields are used. Schema extensions can add complexity and increase the chance of errors, especially if they’re not consistently mapped or validated during data collection.
- Error Logging: Set up robust error logging at each stage — from Adobe Launch to AEP ingestion — to capture and troubleshoot any issues that arise due to data type mismatches or schema violations.
Immutable changes: Enable for profile & breaking changes
WARNING: Immutable Decisions x 2 incoming!
Once a schema is enabled for a profile, it cannot be disabled or deleted, and fields cannot be removed from the schema. These are TWO immutable decisions!
Breaking changes can be made to a schema as long as it has never been used in the creation of a dataset or enabled for use in Real-Time Customer Profile. Once a schema has been used in dataset creation or enabled for use with Real-Time Customer Profile, the rules of Schema Evolution become strictly enforced by the system.
It is possible to architect yourself into a corner here and create a data governance mess. Having bolted bits onto your schema, you have few options left after making the immutable changes above.
Options:
- Nuke the dataset, losing all historical data
- Change the friendly name to include 'deprecated',
Attempting to make changes via the API is no different; the API will return a 'no breaking changes' error.
QA processes need to change
Do not allow data to flow into a production dataset untested; test thoroughly in the lower environment, making sure that all journeys are completed. A tool like ObservePoint may assist; however, most companies make do with Selenium.
Official Platform Tools for Monitoring & Identifying Errors
The tooling inside AEP is insufficient: it does not give you an overall view of ingestion error volume, only a basic interface.
UI Augmentation of this screen is required:
- Monitoring Dashboard: AEP offers only a basic monitoring interface in the UI where users can view streaming end-to-end data ingestion and batch end-to-end data ingestion.
- Failed Records Details: For failed records, AEP provides information such as the number of records ingested, file size, and ingestion start/end times. It also offers details on the specific errors that occurred during processing.
It is necessary to sift through all the batches looking for errors; however, when an error is found there is no way to tell how prevalent it is.
Monitoring and Troubleshooting — via API / Postman
https://platform.adobe.io/data/foundation/export/batches/:batchId/failed
Traversing up to the batch, we see that this is coming from the back end. USD, GBP, EUR, etc. are fine; whilst these could be fixed in the data element, this isn't the place to fix such mutations. BR$ was also seen.
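The endpoint above can be called from Python just as easily as from Postman. A minimal sketch: the first call lists the failed-record files for a batch, and the path query parameter (as seen in the _links href earlier) downloads each one as NDJSON:

import requests

EXPORT = "https://platform.adobe.io/data/foundation/export/batches"
HEADERS = {
    "Authorization": "Bearer <ACCESS_TOKEN>",
    "x-api-key": "<API_KEY>",
    "x-gw-ims-org-id": "<IMS_ORG>@AdobeOrg",
    "x-sandbox-name": "<SANDBOX_NAME>",
}

batch_id = "01J8J324F5MFTHE5JX48P4QM5N"
listing = requests.get(f"{EXPORT}/{batch_id}/failed", headers=HEADERS)
listing.raise_for_status()

for item in listing.json().get("data", []):
    failed = requests.get(f"{EXPORT}/{batch_id}/failed",
                          params={"path": item["name"]}, headers=HEADERS)
    failed.raise_for_status()
    print(f"{item['name']}: {len(failed.text.splitlines())} failed records")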
Root Causes of ingestion errors stem from schema design, poor planning & activation processes
Schema design needs to be thought through.
A “surveillance script” I knocked together in Python / AEP Jupyter Workbench
We needed to prioritize… sifting through the limited UI and yelling 'I found another one' rapidly became an endless game of whack-a-mole.
I needed a way to get a helicopter view of all errors, to allow product owners to prioritize fixes. Whilst all the errors needed fixing eventually for confidence, I was already past 8 work items.
For now, all that mattered was use-case prioritisation: was a lack of datafill stopping the personalisation workflow from activating a use case?
This is exactly what I needed!! Something which allowed me to create & prioritize fixes!
This was done in Python, run inside AEP's Jupyter Notebook environment.
Part 1 — Pull all the faulty batches (< 24 hrs) and list them
Note: the API parameters caused some difficulty; see the sketch below.
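Here is a sketch of Part 1, assuming the Catalog Service's createdAfter and status query parameters; the trap (and quite possibly the difficulty alluded to above) is that Catalog timestamps are epoch milliseconds, not seconds:

import time
import requests

CATALOG = "https://platform.adobe.io/data/foundation/catalog/batches"
HEADERS = {
    "Authorization": "Bearer <ACCESS_TOKEN>",
    "x-api-key": "<API_KEY>",
    "x-gw-ims-org-id": "<IMS_ORG>@AdobeOrg",
    "x-sandbox-name": "<SANDBOX_NAME>",
}

# Catalog expects epoch MILLISECONDS; passing seconds silently matches nothing
since_ms = int((time.time() - 24 * 60 * 60) * 1000)

resp = requests.get(CATALOG, headers=HEADERS, params={
    "createdAfter": since_ms,
    "status": "failed",
    "orderBy": "desc:created",
    "limit": 100,
})
resp.raise_for_status()

failed_batches = list(resp.json().keys())  # the response is a dict keyed by batch ID
print(f"{len(failed_batches)} failed batches in the last 24 hours")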
Part 2 — Download all the faulty batches identified in Part 1
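Part 2 continues with failed_batches and HEADERS from the Part 1 sketch; each failed batch exposes its rejected records as NDJSON files via the export endpoint shown earlier:

import requests

EXPORT = "https://platform.adobe.io/data/foundation/export/batches"

def download_failed_records(batch_id, headers):
    """Yield every raw NDJSON line from every failed-record file in a batch."""
    listing = requests.get(f"{EXPORT}/{batch_id}/failed", headers=headers)
    listing.raise_for_status()
    for item in listing.json().get("data", []):
        part = requests.get(f"{EXPORT}/{batch_id}/failed",
                            params={"path": item["name"]}, headers=headers)
        part.raise_for_status()
        yield from part.text.splitlines()

lines = [line for b in failed_batches
         for line in download_failed_records(b, HEADERS)]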
Part 3 — Sift through all the faulty batches, extracting the DCVS error codes
As implementation best practice, I always populate an eVar with the rule name and function, e.g. 'productView'; this allows me to navigate straight to the Launch component that needs attention.
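Part 3 parses each rejected record. In the failed-batch examples above, the _errors block sits alongside the original record fields, which is what this sketch assumes; the eVar number holding the rule name is hypothetical, so substitute your own convention:

import json

def extract_errors(ndjson_lines):
    """Yield one row per DCVS validation error found in the failed records."""
    for line in ndjson_lines:
        record = json.loads(line)
        evars = (record.get("_experience", {}).get("analytics", {})
                       .get("customDimensions", {}).get("eVars", {}))
        for err in record.get("_errors", {}).get("_streamingValidation", []):
            yield {
                "code": err.get("code"),      # e.g. DCVS-1104-400
                "path": err.get("path"),      # e.g. #/productListItems/0/quantity
                "message": err.get("message"),
                "rule": evars.get("eVar75"),  # hypothetical rule-name eVar
            }

errors = list(extract_errors(lines))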
Part 4 — Pivot, count, display
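Part 4 is a straightforward pandas pivot over the rows extracted in Part 3; the output is the helicopter view: which error, on which path, from which Launch rule, and how often:

import pandas as pd

df = pd.DataFrame(errors)
summary = (df.groupby(["code", "path", "rule"], dropna=False)
             .size()
             .reset_index(name="count")
             .sort_values("count", ascending=False))
print(summary.head(20))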
Is the data I need present for my Use Case? — datafill metrics
Against the background of ingestion errors, one may need to check that the data points needed to stand up a use case are present.
For example:
- Use case requires SKU to be there on all product views, for the activation to occur.
Let's create a metric:

datafill % = (product views where SKU is not null) / (all product views) × 100

This will allow us to move forward with confidence, knowing that the data needed is there.
SELECT
DATE("timestamp") AS event_date,
-- Count and percentage for SKU
COUNT(CASE
WHEN _clientname.product.SKU IS NOT NULL
AND CARDINALITY(_clientname.product.SKU) > 0
THEN 1
END) AS sku_count,
-- Count and percentage for businessGroup
COUNT(CASE
WHEN _clientname.product.businessGroup IS NOT NULL
AND CARDINALITY(_clientname.product.businessGroup) > 0
THEN 1
END) AS business_group_count,
-- Count and percentage for description
COUNT(CASE
WHEN _clientname.product.description IS NOT NULL
AND CARDINALITY(_clientname.product.description) > 0
THEN 1
END) AS description_count,
-- Count and percentage for pdpBannerImage
COUNT(CASE
WHEN _clientname.product.pdpBannerImage IS NOT NULL
AND CARDINALITY(_clientname.product.pdpBannerImage) > 0
THEN 1
END) AS pdp_banner_image_count,
COUNT(*) AS total_product_views,
-- SKU percentage with a '%' symbol
CONCAT(
ROUND(
(COUNT(CASE
WHEN _clientname.product.SKU IS NOT NULL
AND CARDINALITY(_clientname.product.SKU) > 0
THEN 1
END)::DECIMAL / COUNT(*) * 100), 1
), '%'
) AS sku_percentage,
-- businessGroup percentage with a '%' symbol
CONCAT(
ROUND(
(COUNT(CASE
WHEN _clientname.product.businessGroup IS NOT NULL
AND CARDINALITY(_clientname.product.businessGroup) > 0
THEN 1
END)::DECIMAL / COUNT(*) * 100), 1
), '%'
) AS business_group_percentage,
-- description percentage with a '%' symbol
CONCAT(
ROUND(
(COUNT(CASE
WHEN _clientname.product.description IS NOT NULL
AND CARDINALITY(_clientname.product.description) > 0
THEN 1
END)::DECIMAL / COUNT(*) * 100), 1
), '%'
) AS description_percentage,
-- pdpBannerImage percentage with a '%' symbol
CONCAT(
ROUND(
(COUNT(CASE
WHEN _clientname.product.pdpBannerImage IS NOT NULL
AND CARDINALITY(_clientname.product.pdpBannerImage) > 0
THEN 1
END)::DECIMAL / COUNT(*) * 100), 1
), '%'
) AS pdp_banner_image_percentage
FROM
public.prod_dataset
WHERE
eventType = 'productView'
AND DATE("timestamp") >= CURRENT_DATE - INTERVAL '60' DAY
GROUP BY
event_date
ORDER BY
event_date DESC;
Output:
Softcrylic Datafacts — like Observepoint but for CDP?
Worthy of note is Softcrylic's DataFacts tool. Very little public detail is available, but the screenshot suggests it is built on the APIs above via some sort of React framework.
See:
DataFacts is an AI-based Digital Data Assurance solution that helps ensure the accuracy and consistency of your customer data used for marketing and sales. DataFacts saves your team the time of assessing errors and issues in your data and saves your company the unquantifiable cost of your data having errors. The more DataFacts churns through your data, the more intelligent it becomes about your data making it more accurate and precise in its findings.
These services help maximize the reliability and efficiency of data in AEP.