We have a service that generates about 7.2B events in a 30-minute window, and we need to store these events in Azure Data Explorer. To get the data there we are using Event Hubs, since they let us stream the data in real time. However, we are not able to push that many records to the Event Hub through AMQP with the Event Hubs client SDK. Is there any way to push this volume of events within a comparable duration (~30 minutes)?
So far we have been able to push 2.5 million events to the hub in batches (each batch containing 400 events) with the following code:
public static async Task EventHub(string processLogs)
{
    var producerOptions = new EventHubProducerClientOptions();
    // Use AMQP over WebSockets (port 443) instead of plain AMQP on 5671.
    producerOptions.ConnectionOptions.TransportType = EventHubsTransportType.AmqpWebSockets;

    await using var producer = new EventHubProducerClient(connectionstring, hubName, producerOptions);

    using EventDataBatch eventBatch = await producer.CreateBatchAsync();
    if (!eventBatch.TryAdd(new EventData(Encoding.UTF8.GetBytes(processLogs))))
    {
        throw new InvalidOperationException("Payload is too large for a single batch.");
    }

    await producer.SendAsync(eventBatch);
}
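
One variant we are considering is to fill each EventDataBatch to capacity before sending it and to spread the sends over several concurrent workers, rather than one SendAsync per call. Below is a minimal sketch of that idea; eventPayloads, workerCount, and SendChunkAsync are placeholder names, connectionstring and hubName are the same fields as in the snippet above, and the worker count would still need tuning. Would something like this scale to the required volume, or is there a better pattern (more partitions, multiple producer clients, etc.)?

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Producer;

// Sketch only: pack each batch until TryAdd reports it is full, flush it,
// and run a few workers concurrently on one producer client.
public static async Task SendAllAsync(IReadOnlyList<string> eventPayloads, int workerCount = 8)
{
    var producerOptions = new EventHubProducerClientOptions();
    producerOptions.ConnectionOptions.TransportType = EventHubsTransportType.AmqpWebSockets;

    await using var producer = new EventHubProducerClient(connectionstring, hubName, producerOptions);

    // Give every worker a round-robin slice of the payloads.
    var workers = Enumerable.Range(0, workerCount)
        .Select(w => SendChunkAsync(producer, eventPayloads.Where((_, i) => i % workerCount == w)));

    await Task.WhenAll(workers);
}

private static async Task SendChunkAsync(EventHubProducerClient producer, IEnumerable<string> payloads)
{
    EventDataBatch batch = await producer.CreateBatchAsync();

    foreach (string payload in payloads)
    {
        var eventData = new EventData(Encoding.UTF8.GetBytes(payload));

        // TryAdd returns false once the batch hits the size limit,
        // so send the current batch and start a new one.
        if (!batch.TryAdd(eventData))
        {
            await producer.SendAsync(batch);
            batch.Dispose();
            batch = await producer.CreateBatchAsync();

            if (!batch.TryAdd(eventData))
                throw new InvalidOperationException("A single event exceeds the maximum batch size.");
        }
    }

    if (batch.Count > 0)
    {
        await producer.SendAsync(batch);
    }
    batch.Dispose();
}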
Each event looks like this:
{
  "Event_End_Timestamp": "2020-11-06T06:00:01.8700000Z",
  "Event_Duration": 0.163,
  "Scrap": 5.0,
  "OperationCycle": 0.19,
  "WorkOrder_ID": "123450",
  "Sku": "98764",
  "Sequence": "10",
  "Process_ID": "H200",
  "Machine_ID": "M1503",
  "Parent_Event_ID": 10000002,
  "Event_ID": 10000000,
  "Event_Type_Counter": 10,
  "Event_Type": "OpercationCycle",
  "Event_Start_Timestamp": "2020-11-06T06:00:01.7060000Z"
}
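
In case it matters for the answer: each payload is just the UTF-8 JSON of a flat record like the one above. A rough C# shape would be the following (the ProcessEvent type is hypothetical and the property types are inferred from the sample values):

using System;
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical POCO mirroring the sample event; shown only to illustrate the shape.
public class ProcessEvent
{
    [JsonPropertyName("Event_End_Timestamp")] public DateTimeOffset EventEndTimestamp { get; set; }
    [JsonPropertyName("Event_Duration")] public double EventDuration { get; set; }
    [JsonPropertyName("Scrap")] public double Scrap { get; set; }
    [JsonPropertyName("OperationCycle")] public double OperationCycle { get; set; }
    [JsonPropertyName("WorkOrder_ID")] public string WorkOrderId { get; set; }
    [JsonPropertyName("Sku")] public string Sku { get; set; }
    [JsonPropertyName("Sequence")] public string Sequence { get; set; }
    [JsonPropertyName("Process_ID")] public string ProcessId { get; set; }
    [JsonPropertyName("Machine_ID")] public string MachineId { get; set; }
    [JsonPropertyName("Parent_Event_ID")] public long ParentEventId { get; set; }
    [JsonPropertyName("Event_ID")] public long EventId { get; set; }
    [JsonPropertyName("Event_Type_Counter")] public int EventTypeCounter { get; set; }
    [JsonPropertyName("Event_Type")] public string EventType { get; set; }
    [JsonPropertyName("Event_Start_Timestamp")] public DateTimeOffset EventStartTimestamp { get; set; }
}

// Each record is serialized before being handed to the Event Hub batch:
// string payload = JsonSerializer.Serialize(processEvent);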