I have a .NET 8 background service responsible for two primary tasks:
1. Push JSON data packets to message brokers (ActiveMQ, RabbitMQ, or JMS queues):
- Sends approximately 25,000 JSON data packets every 2 minutes (data size can vary from 25,000 to 20,000,000).
- This process is handled asynchronously.
2. Manage multiple services using this push mechanism:
- There are 21 different services running concurrently, all working asynchronously against a single database.
- Each service sleeps for 10 minutes after sending a packet. (A minimal sketch of one such service loop is shown below.)
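For context, each of the 21 services is essentially the loop below. This is a simplified sketch only: `IBrokerClient` and `IPacketSource` are placeholder abstractions standing in for the actual broker client (ActiveMQ/RabbitMQ/JMS) and data source, not the real implementations.

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

// Placeholder abstractions; the real code talks to ActiveMQ/RabbitMQ/JMS
// and to the single shared database.
public interface IBrokerClient
{
    Task PublishAsync(string json, CancellationToken ct);
}

public interface IPacketSource
{
    IAsyncEnumerable<string> ReadBatchAsync(CancellationToken ct);
}

public sealed class PacketPushService : BackgroundService
{
    private readonly IBrokerClient _broker;
    private readonly IPacketSource _packets;

    public PacketPushService(IBrokerClient broker, IPacketSource packets)
    {
        _broker = broker;
        _packets = packets;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            // Push the current batch of JSON packets to the broker asynchronously.
            await foreach (var json in _packets.ReadBatchAsync(stoppingToken))
            {
                await _broker.PublishAsync(json, stoppingToken);
            }

            // Each service sleeps for 10 minutes after sending its batch.
            await Task.Delay(TimeSpan.FromMinutes(10), stoppingToken);
        }
    }
}
```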
New Requirement:
I now need to implement auditing for every data packet pushed to the broker. Specifically, I need to log each record in a SQL database with a timestamp indicating when it was sent.
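Concretely, the audit row would look something like the record below; the names and types are illustrative placeholders, not a finalized schema.

```csharp
using System;

// Illustrative audit record only; column names and types are placeholders.
public sealed record PacketAudit(
    long PacketId,        // identifier of the pushed packet
    string QueueName,     // broker/queue the packet was sent to
    DateTime SentAtUtc);  // timestamp of the push, stored in UTC
```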
Challenges:
- The volume of data being processed is significant, and downtime for the SQL database is not acceptable.
- The solution needs to handle high throughput efficiently without becoming a bottleneck to queue push operations.
Approaches I’ve considered so far:
- Implementing a simple fire-and-forget path using C#’s asynchronous capabilities to log the data to SQL (see the sketch after this list). However, I am unsure whether this can handle the load effectively.
- Using a background job processing library like Hangfire to offload the auditing to a separate worker process.
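For the fire-and-forget option, what I’m picturing is roughly the sketch below: the push path only writes an audit record to an in-memory System.Threading.Channels queue, and a separate background writer drains it and inserts batches into SQL, so the database write never sits on the hot path of the broker push. The batch size and the `IAuditStore` abstraction (which could wrap `SqlBulkCopy` or multi-row INSERTs) are assumptions rather than settled choices, and it reuses the `PacketAudit` record sketched above.

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

// Hypothetical persistence abstraction; could be implemented with
// SqlBulkCopy or a single multi-row INSERT per batch.
public interface IAuditStore
{
    Task SaveBatchAsync(IReadOnlyList<PacketAudit> batch, CancellationToken ct);
}

public sealed class AuditChannel
{
    // Unbounded for simplicity; a bounded channel with a wait/drop policy
    // may be safer if the writer ever falls behind.
    private readonly Channel<PacketAudit> _channel =
        Channel.CreateUnbounded<PacketAudit>();

    public ChannelReader<PacketAudit> Reader => _channel.Reader;

    // Called from the push path right after a packet is published;
    // never blocks on SQL.
    public void Enqueue(PacketAudit audit) => _channel.Writer.TryWrite(audit);
}

public sealed class AuditWriterService : BackgroundService
{
    private const int BatchSize = 1000;   // assumed value, needs tuning

    private readonly AuditChannel _audits;
    private readonly IAuditStore _store;

    public AuditWriterService(AuditChannel audits, IAuditStore store)
    {
        _audits = audits;
        _store = store;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        var batch = new List<PacketAudit>(BatchSize);

        // Wait until at least one audit record is available, then drain
        // up to BatchSize records and persist them in one round trip.
        while (await _audits.Reader.WaitToReadAsync(stoppingToken))
        {
            while (batch.Count < BatchSize && _audits.Reader.TryRead(out var audit))
                batch.Add(audit);

            if (batch.Count > 0)
            {
                await _store.SaveBatchAsync(batch, stoppingToken);
                batch.Clear();
            }
        }
    }
}
```

My main concern with this sketch is memory growth in the unbounded channel if SQL slows down; a bounded channel, or dropping audits after a retry budget, might be a better trade-off, but I’m not sure.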
I haven’t finalized an approach yet and am open to other ideas or improvements. Any suggestions or insights would be greatly appreciated.