Microsoft Exam DP-420 Topic 10 Question 2 Discussion

Actual exam question for Microsoft's DP-420 exam
Question #: 2
Topic #: 10

You need to configure an Apache Kafka instance to ingest data from an Azure Cosmos DB Core (SQL) API account. The data from a container named telemetry must be added to a Kafka topic named iot. The solution must store the data in a compact binary format.

Which three configuration items should you include in the solution? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

Suggested Answer: C, D, F

C: Avro is a binary format, while JSON is text (see the converter comparison below).

F: Kafka Connect for Azure Cosmos DB is a connector for reading data from and writing data to Azure Cosmos DB. The Azure Cosmos DB sink connector lets you export data from Apache Kafka topics to an Azure Cosmos DB database. The connector polls data from Kafka and writes it to containers in the database based on the topic subscription.

D: Create the Azure Cosmos DB sink connector in Kafka Connect. The following JSON body defines the config for the sink connector.

Extract:

"connector.class": "com.azure.cosmos.kafka.connect.sink.CosmosDBSinkConnector",
"key.converter": "io.confluent.connect.avro.AvroConverter",
"connect.cosmos.containers.topicmap": "hotels#kafka"
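For reference, the topic map uses the documented topic#container format, so for this question's names (topic iot, container telemetry) the mapping would presumably read:

"connect.cosmos.containers.topicmap": "iot#telemetry"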

Incorrect Answers:

B: JSON is plain text.
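For comparison, here is a minimal converter sketch (not part of the original answer; the Schema Registry address is an assumed local default): the Confluent Avro converter serializes records as compact binary Avro and requires a Schema Registry, whereas the built-in JSON converter emits plain text.

"value.converter": "io.confluent.connect.avro.AvroConverter",
"value.converter.schema.registry.url": "http://localhost:8081"

versus the plain-text alternative:

"value.converter": "org.apache.kafka.connect.json.JsonConverter",
"value.converter.schemas.enable": "false"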

Note, full example:

{
  "name": "cosmosdb-sink-connector",
  "config": {
    "connector.class": "com.azure.cosmos.kafka.connect.sink.CosmosDBSinkConnector",
    "tasks.max": "1",
    "topics": "hotels",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://<schema-registry-host>:8081",
    "value.converter.schemas.enable": "false",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "http://<schema-registry-host>:8081",
    "key.converter.schemas.enable": "false",
    "connect.cosmos.connection.endpoint": "https://<cosmosinstance-name>.documents.azure.com:443/",
    "connect.cosmos.master.key": "<cosmosdbprimarykey>",
    "connect.cosmos.databasename": "kafkaconnect",
    "connect.cosmos.containers.topicmap": "hotels#kafka"
  }
}
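As a usage sketch (assuming the JSON above is saved as cosmosdb-sink-connector.json and a Kafka Connect worker is listening on its default REST port, 8083), the connector can then be registered by posting the config to the Connect REST API:

curl -X POST -H "Content-Type: application/json" \
  --data @cosmosdb-sink-connector.json \
  http://localhost:8083/connectors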


https://docs.microsoft.com/en-us/azure/cosmos-db/sql/kafka-connector-sink

https://www.confluent.io/blog/kafka-connect-deep-dive-converters-serialization-explained/



