
14.5 Forward events towards the AWS ecosystem


Completion of this exercise is optional, and a cost is involved to use AWS Kinesis. While AWS provides a free tier account that lets you test and configure many services at no cost, AWS Kinesis isn't part of that free tier, so implementing and testing this exercise will incur charges.

Good to know

Adobe Experience Platform supports various Amazon services as destinations. Kinesis and S3 are both profile export destinations that can be used as part of Adobe Experience Platform's Real-Time CDP. You can easily feed high-value segment events and associated profile attributes into your systems of choice.

In this exercise, you'll learn how to set up your own Amazon Kinesis stream to stream event data coming from the Adobe Experience Platform Edge ecosystem to a cloud storage destination, such as Amazon S3. This is useful if you'd like to collect experience events from web and mobile properties and push them into your data lake for analysis and operational reporting. Data lakes generally ingest data in a batch fashion with large daily file imports, and they do not expose a public HTTP endpoint that could be used in conjunction with event forwarding.

Supporting the above use cases implies that streamed data needs to be buffered or placed in a queue before being written to a file. Care has to be taken not to open a file for write access across multiple processes. Delegating this task to a dedicated system is ideal: it scales well while ensuring a great level of service. This is where Kinesis comes to the rescue.

Amazon Kinesis Data Streams focuses on ingesting and storing data streams. Kinesis Data Firehose focuses on delivering data streams to select destinations, such as S3 buckets.
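To make that division of labor concrete, here is a minimal Python (boto3) sketch of the producer side. It assumes boto3 is installed, your AWS credentials are configured locally, the region matches your stream's region, and the stream name is the one you'll create in exercise 14.5.2; it is an illustration, not part of the required setup.

# Producer sketch: write one JSON record into the Kinesis data stream.
# The Firehose delivery stream configured in 14.5.3 then buffers records
# and delivers them to S3 as files.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-west-2")  # use your stream's region

record = {"message": "Hello World", "dynamicPartitioningKey": "v2"}

kinesis.put_record(
    StreamName="--demoProfileLdap---datastream",
    Data=json.dumps(record).encode("utf-8"),
    PartitionKey="1",
)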

As part of this exercise, you'll...

  • Perform a basic setup of a Kinesis data stream

  • Create a Firehose delivery stream and use an S3 bucket as the destination

  • Configure Amazon API Gateway as a REST API endpoint to receive your event data

  • Forward raw event data from Adobe's Edge to your Kinesis stream

14.5.1 Configure your AWS S3 bucket

Go to https://console.aws.amazon.com and sign in with the Amazon account you previously created.

After logging in, you'll be redirected to the AWS Management Console.

In the Find Services menu, search for s3. Click the first search result: S3 - Scalable Storage in the Cloud.

You'll then see the Amazon S3 homepage. Click Create Bucket.

In the Create Bucket screen, you need to configure two things:

  • Name: use the name eventforwarding---demoProfileLdap--. As an example, the bucket name used in this exercise is aepmodulertcdpvangeluw

  • Region: use the region EU (Frankfurt) eu-central-1

Leave all the other default settings as they are. Scroll down and click Create bucket.

You'll then see your bucket being created and will be redirected to the Amazon S3 homepage.
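If you prefer scripting over the console, the same bucket can be created with a short boto3 sketch. This is an optional equivalent of the steps above; it assumes boto3 is installed, your credentials are configured locally, and uses the placeholder bucket name and Frankfurt region from this exercise.

import boto3

s3 = boto3.client("s3", region_name="eu-central-1")

# Buckets outside us-east-1 need an explicit LocationConstraint.
s3.create_bucket(
    Bucket="eventforwarding---demoProfileLdap--",
    CreateBucketConfiguration={"LocationConstraint": "eu-central-1"},
)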

14.5.2 Configure your AWS Kinesis Data Stream

In the Find Services menu, search for kinesis. Click the first search result: Kinesis - Work with Real-Time Streaming Data.

Select Kinesis Data Streams. Click Create data stream.

For the Data stream name, use --demoProfileLdap---datastream.

There's no need to change any of the other settings. Scroll down and click Create data stream.

You'll then see this. Once your data stream is successfully created, you can move forward to the next exercise.
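As an optional alternative to the console, a minimal boto3 sketch for the same step is shown below. It assumes provisioned capacity with a single shard to keep things simple (the console may default to on-demand capacity), and uses the stream name and region from this exercise.

import boto3

kinesis = boto3.client("kinesis", region_name="us-west-2")  # use your own region

# Create the stream with one shard (provisioned mode).
kinesis.create_stream(StreamName="--demoProfileLdap---datastream", ShardCount=1)

# Block until the stream is Active before moving on.
kinesis.get_waiter("stream_exists").wait(StreamName="--demoProfileLdap---datastream")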

14.5.3 Configure your AWS Firehose Delivery Stream

In the Find Services menu, search for kinesis. Click Kinesis Data Firehose.

Click Create delivery stream.

For Source, select Amazon Kinesis Data Streams. For Destination, select Amazon S3. Click Browse to select your data stream.

Select your data stream. Click Choose.

You'll then see this. Remember the Delivery stream name as you'll need it later.

Scroll down until you see Destination Settings. Click Browse to select your S3 bucket.

Select your S3 bucket and click Choose.

You'll then see something like this. Update the following settings:

  • Dynamic partitioning: set to Enabled

  • Multi record deaggregation: set to Disabled

  • New line delimiter: set to Enabled

  • Inline parsing for JSON: set to Enabled

Scroll down a bit, you'll then see this. Update the following settings:

  • Dynamic partitioning keys

    • Key name: dynamicPartitioningKey

    • JQ expression: .dynamicPartitioningKey

  • S3 bucket prefix: add the following code:

!{partitionKeyFromQuery:dynamicPartitioningKey}/!{timestamp:yyyy}/!{timestamp:MM}/!{timestamp:dd}/!{timestamp:HH}/
  • S3 bucket error output prefix: set to error

Finally, scroll down a bit more and click Create delivery stream.

After a couple of minutes, your delivery stream will be created and Active.
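If you want to check that status programmatically, a small boto3 sketch follows; the delivery stream name is a placeholder for the name you noted earlier in this exercise.

import boto3

firehose = boto3.client("firehose", region_name="us-west-2")  # use your own region

# Check the delivery stream status; repeat until it reports ACTIVE.
resp = firehose.describe_delivery_stream(DeliveryStreamName="YOUR-DELIVERY-STREAM-NAME")
print(resp["DeliveryStreamDescription"]["DeliveryStreamStatus"])  # e.g. CREATING or ACTIVE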

14.5.4 Configure your AWS IAM Role

In the Find Services menu, search for iam. Click IAM.

Click Roles.

Search for your KinesisFirehose role. Click it to open it.

Click your Permissions Policy name to open it.

In the new screen that opens, click Edit Policy.

Under Kinesis - Actions, ensure that the Write permission for PutRecord is enabled. Click Review Policy.

Click Save Changes.

You'll then be back here. Click Roles.

Search for your KinesisFirehose role. Click it to open it.

Go to Trust relationships and click Edit trust policy.

Overwrite the current trust policy by pasting this code over the existing code:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": [
          "firehose.amazonaws.com",
          "kinesis.amazonaws.com",
          "apigateway.amazonaws.com"
        ]
      },
      "Action": "sts:AssumeRole"
    }
  ]
}

Click Update policy.

You'll then see this. You'll need to specify the ARN for this role in the next step.
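If you prefer to script this step, here's a minimal boto3 sketch that applies the same trust policy and prints the role's ARN; the role name is a placeholder for the KinesisFirehose role you opened above, and credentials are assumed to be configured locally.

import json
import boto3

iam = boto3.client("iam")

trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": [
                    "firehose.amazonaws.com",
                    "kinesis.amazonaws.com",
                    "apigateway.amazonaws.com",
                ]
            },
            "Action": "sts:AssumeRole",
        }
    ],
}

# Replace the role's trust policy with the one above.
iam.update_assume_role_policy(
    RoleName="YOUR-KINESISFIREHOSE-ROLE-NAME",  # placeholder
    PolicyDocument=json.dumps(trust_policy),
)

# Print the role ARN, which you'll need in the API Gateway step.
print(iam.get_role(RoleName="YOUR-KINESISFIREHOSE-ROLE-NAME")["Role"]["Arn"])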

14.5.5 Configure your AWS API Gateway

Amazon API Gateway is an AWS service for creating, publishing, maintaining, monitoring, and securing REST, HTTP, and WebSocket APIs at any scale. API developers can create APIs that access AWS or other web services, as well as data stored in the AWS Cloud.

You will now expose the Kinesis data stream to the internet through an HTTPS endpoint, which can then be consumed directly by Adobe services like Event Forwarding.

In the Find Services menu, search for api gateway. Click API Gateway.

You'll then see something like this. Click Create API.

Click Build on the REST API card.

You'll then see this. Fill out the settings like this:

  • Choose the protocol: select REST

  • Create new API: select New API

  • Settings:

    • API name: use --demoProfileLdap---eventforwarding

    • Endpoint Type: select Regional

Click Create API.

You'll then see this. Click Actions and then click Create Resource.

You'll then see this. Set Resource Name to stream. Click Create Resource.

You'll then see this. Click Actions and then click Create Method.

In the dropdown, select POST and click the v button.

You'll then see this. Fill out the settings like this:

  • Integration type: AWS Service

  • AWS Region: select the region that is used by your Kinesis Data Stream, in this case: us-west-2

  • AWS Service: select Kinesis

  • AWS Subdomain: leave empty

  • HTTP Method: select POST

  • Action Type: select Use action name

  • Action: enter PutRecord

  • Execution role: paste the ARN of the execution role that is used by your Kinesis Data Firehose, as instructed in the previous exercise

  • Content Handling: select Passthrough

  • Use Default Timeout: enable the checkbox

Click Save.

You'll then see this. Click Integration Request.

Click HTTP Headers.

Scroll down a bit and click Add header.

Set Name to Content-Type, set Mapped from to 'application/x-amz-json-1.1'. Click the v icon to save your changes.

You'll then see this. For Request body passthrough, select When there are no templates defined (recommended). Next, click Add mapping template.

Under Content-Type, enter application/json. Click the v icon to save your changes.

Scroll down to find a code editor window. Paste the below code in there:

{
  "StreamName": "$input.path('$.StreamName')",
  "Data": "$util.base64Encode($input.json('$.Data'))",
  "PartitionKey": "$input.path('$.PartitionKey')"
}

Click Save.
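To make the template's intent concrete, here is a small Python sketch of the transformation it performs: the Data object is base64-encoded because the Kinesis PutRecord API expects Data as a base64-encoded blob. The sample body matches the test payload used below; this sketch is illustrative only.

import base64
import json

# Example request body as sent to the API Gateway /stream resource.
body = {
    "Data": {"message": "Hello World", "dynamicPartitioningKey": "v2"},
    "PartitionKey": "1",
    "StreamName": "--demoProfileLdap---datastream",
}

# What the mapping template produces for the Kinesis PutRecord call.
transformed = {
    "StreamName": body["StreamName"],
    "Data": base64.b64encode(json.dumps(body["Data"]).encode("utf-8")).decode("ascii"),
    "PartitionKey": body["PartitionKey"],
}
print(json.dumps(transformed, indent=2))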

Next, scroll up and click <- Method Execution to go back.

Click TEST.

Scroll down, and paste this code under Request Body. Click Test.

{
  "Data": {
    "message": "Hello World",
    "dynamicPartitioningKey": "v2"
  },
  "PartitionKey": "1",
  "StreamName": "--demoProfileLdap---datastream"
}

You'll then see a similar result:

You'll then see this. Click Actions and then click Deploy API.

For Deployment stage, select New Stage. As Stage name, enter prod. Click Deploy.

You'll then see this. Click Save Changes. FYI: the URL in the image is the URL to which you'll send data (in this example: https://vv1i5vwg2k.execute-api.us-west-2.amazonaws.com/prod).

You can test your setup by using the below cURL request. All you need to do is replace the URL below with yours (https://vv1i5vwg2k.execute-api.us-west-2.amazonaws.com/prod in this example) and add /stream at the end of the URL.

curl --location --request POST 'https://vv1i5vwg2k.execute-api.us-west-2.amazonaws.com/prod/stream' \
--header 'Content-Type: application/json' \
--data-raw '{
    "Data": {
        "userid": "--demoProfileLdap--@adobe.com",
        "firstName":"--demoProfileLdap--",
        "offerName":"10% off on outdoor gears",
        "offerCode": "10OFF-SPRING",
        "dynamicPartitioningKey": "campaign"
    },
    "PartitionKey": "1",
    "StreamName": "--demoProfileLdap---datastream"
}'

Paste the updated code above into a Terminal window and press Enter. You'll then see a response similar to the one you saw when testing above.
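If you'd rather test from Python, the following sketch mirrors the cURL request using the requests library (assumed to be installed); replace the URL with your own API Gateway invoke URL plus /stream.

import requests

url = "https://vv1i5vwg2k.execute-api.us-west-2.amazonaws.com/prod/stream"  # replace with yours

payload = {
    "Data": {
        "userid": "--demoProfileLdap--@adobe.com",
        "firstName": "--demoProfileLdap--",
        "offerName": "10% off on outdoor gears",
        "offerCode": "10OFF-SPRING",
        "dynamicPartitioningKey": "campaign",
    },
    "PartitionKey": "1",
    "StreamName": "--demoProfileLdap---datastream",
}

# json= serializes the payload and sets the Content-Type header for you.
response = requests.post(url, json=payload)
print(response.status_code, response.text)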

14.5.6 Update your Event Forwarding property

You can now reach your AWS Kinesis data stream through AWS API Gateway, which means you can send your raw experience events into the AWS ecosystem. Using Real-Time CDP Connections and Event Forwarding, you can now easily forward events to your newly created AWS API Gateway endpoint.

14.5.6.1 Update your Event Forwarding property: Create a Data Element

Go to https://experience.adobe.com/#/data-collection/ and go to Event Forwarding. Search your Event Forwarding property and click it to open it.

In the left menu, go to Data Elements. Click Add Data Element.

You'll then see a new data element to configure.

Make the following selection:

  • As the Name, enter awsDataObject.

  • As the Extension, select Core.

  • As the Data Element Type, select Custom Code.

You'll now have this. Click </> Open Editor.

In the Editor, paste the following code on line 3. Click Save.

// Copy the XDM object from the incoming event and add the partitioning key
// expected by the Firehose dynamic partitioning configuration.
const newObj = {...arc.event.xdm, dynamicPartitioningKey: "event_forwarding"}
return JSON.stringify(newObj);

In the above code, a reference is made to arc. arc stands for Adobe Resource Context, and it always refers to the highest available object in the Server Side context. Enrichments and transformations may be added to that arc object using Adobe Experience Platform Data Collection Server functions.

In the above code, a reference is made to event. event refers to a single event, and Adobe Experience Platform Data Collection Server always evaluates every event individually. Sometimes you may see a reference to events in the payload sent by Web SDK Client Side, but in Adobe Experience Platform Data Collection Event Forwarding, every event is evaluated individually.

You'll then be back here. Click Save or Save to Library.

14.5.6.2 Update your Adobe Experience Platform Data Collection Server property: Update your Rule

In the left menu, go to Rules. Click to open the rule All Pages which you created in one of the previous exercises.

You'll then see this. Click the + icon to add a new action.

You'll then see this. Make the following selection:

  • Select the Extension: Adobe Cloud Connector.

  • Select the Action Type: Make Fetch Call.

That should give you the Name Adobe Cloud Connector - Make Fetch Call. You should now see this:

Next, configure the following:

  • Change the request method from GET to POST

  • Enter the URL of the AWS API Gateway endpoint you created in one of the previous steps, which looks like this: https://vv1i5vwg2k.execute-api.us-west-2.amazonaws.com/prod/stream

You should now have this. Next, go to Headers.

Under headers, add a new header with key Content-Type and value application/json. Next, go to Body.

You'll then see this. Paste the following code in the field Body (Raw). Click Keep Changes.

{
    "Data":{{awsDataObject}},
    "PartitionKey": "1",
    "StreamName": "--demoProfileLdap---datastream"
}

You'll then be back here. Click Save or Save to Library.

You've now configured your first rule in an Event Forwarding property. Go to Publishing Flow to publish your changes. Open your Development library by clicking Main.

Click the Add All Changed Resources button, after which you'll see your Rule and Data Element changes appear in this library. Next, click Save & Build for Development. Your changes are now being deployed.

After a couple of minutes, you'll see that the deployment is done and ready to be tested.

14.5.7 Test your configuration

Go to https://builder.adobedemo.com/projects. After logging in with your Adobe ID, you'll see this. Click your website project to open it.

You can now follow the below flow to access the website. Click Integrations.

On the Integrations page, you need to select the Data Collection property that was created in exercise 0.1.

You'll then see your demo website open up. Select the URL and copy it to your clipboard.

Open a new incognito browser window.

Paste the URL of your demo website, which you copied in the previous step. You'll then be asked to login using your Adobe ID.

Select your account type and complete the login process.

You'll then see your website loaded in an incognito browser window. For every demonstration, you'll need to use a fresh, incognito browser window to load your demo website URL.

When you open up your browser Developer View, you can inspect Network requests as indicated below. When you use the filter interact, you'll see the network requests that are sent by Adobe Experience Platform Data Collection Client to the Adobe Edge.

If you select the raw payload, go to https://jsonformatter.org/json-pretty-print and paste the payload. Click Make Pretty. You'll then see the JSON payload, the events object, and the xdm object. In one of the previous steps, when you defined the Data Element, you used the reference arc.event.xdm, which parses out the xdm object of this payload.

Switch your view to AWS. By opening your data stream and going into the Monitoring tab, you'll now see incoming traffic.

When you then open your delivery stream and go into the Monitoring tab, you'll also see incoming traffic.

Finally, when you have a look at your S3 bucket, you'll now notice files being created there as a consequence of your data ingestion.

When you download such a file and open it using a text editor, you'll see that it contains the XDM payload from the events that were forwarded.
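You can also inspect those files programmatically. Here is a small boto3 sketch that lists the objects Firehose wrote to your bucket and prints the first one; it assumes the bucket name placeholder from exercise 14.5.1 and locally configured credentials.

import boto3

s3 = boto3.client("s3", region_name="eu-central-1")  # use your bucket's region
bucket = "eventforwarding---demoProfileLdap--"

# List the objects Firehose delivered to your bucket.
objects = s3.list_objects_v2(Bucket=bucket).get("Contents", [])
for obj in objects[:5]:
    print(obj["Key"], obj["Size"])

# Print the content of the first file, which should contain the forwarded
# XDM payloads (one JSON object per line, thanks to the new line delimiter).
if objects:
    body = s3.get_object(Bucket=bucket, Key=objects[0]["Key"])["Body"].read()
    print(body.decode("utf-8"))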
