1. Confluent Documentation, "Apache Kafka Use Cases": This official resource explicitly lists the correct answers as primary use cases. It states, "Kafka is used for real-time fraud detection," "Real-time recommendations...by capturing and analyzing user behavior in real time," and "Log analysis and monitoring...Kafka can be used to collect and process log data in real time."
Source: Confluent Documentation, "Learn Apache Kafka", Section: "Apache Kafka Use Cases".
2. Narkhede, N., Shapira, G., & Palino, T. (2021). Kafka: The Definitive Guide (2nd ed.). O'Reilly Media. Chapter 1, "Meet Kafka," introduces the core drivers for Kafka's creation and its common applications. It details "Website Activity Tracking" (the basis for recommendations), "Log Aggregation" (for monitoring), and real-time processing of event data, which is fundamental to fraud detection.
Source: Chapter 1, Sections: "Website Activity Tracking" and "Log Aggregation".
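The "Log Aggregation" pattern from Chapter 1 boils down to services appending records to a shared, durable log that multiple consumers read independently at their own offsets. The sketch below is a toy in-memory analogue of that idea, not the real Kafka client API; the class and field names (`MiniLog`, `produce`, `consume`) are hypothetical stand-ins chosen for illustration:

```python
from collections import defaultdict

class MiniLog:
    """Toy in-memory analogue of a Kafka topic: an append-only log
    that multiple consumers read independently, each tracking its
    own offset (read position)."""
    def __init__(self):
        self.records = []                 # the append-only log
        self.offsets = defaultdict(int)   # per-consumer read position

    def produce(self, record):
        self.records.append(record)

    def consume(self, consumer_id):
        """Return all records this consumer has not yet seen,
        then advance its offset."""
        start = self.offsets[consumer_id]
        new = self.records[start:]
        self.offsets[consumer_id] = len(self.records)
        return new

# Several services append log lines; a monitoring consumer and an
# analytics consumer each read the same stream without interfering.
topic = MiniLog()
topic.produce({"service": "web", "level": "ERROR", "msg": "timeout"})
topic.produce({"service": "auth", "level": "INFO", "msg": "login ok"})

errors = [r for r in topic.consume("monitoring") if r["level"] == "ERROR"]
```

Because each consumer keeps its own offset, "log analysis" and "monitoring" can be separate consumers of the same topic, which is exactly why the book treats log aggregation and activity tracking as natural Kafka workloads.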
3. Kleppmann, M. (2017). Designing Data-Intensive Applications. O'Reilly Media. Chapter 11, "Stream Processing," contrasts batch processing with stream processing. It describes use cases like "Complex event processing (CEP)," which includes fraud detection and monitoring systems, as prime examples of stream processing where timely responses to event patterns are critical. End-of-day jobs are cited as a contrasting example: they are batch processing, not stream processing.
Source: Chapter 11, Section: "Stream Processing", Subsection: "Applications of Stream Processing".
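Kleppmann's CEP framing of fraud detection is "match a pattern over events as they arrive and react immediately," rather than waiting for an end-of-day batch. The sliding-window check below is an illustrative sketch of that idea in plain Python, not code from the book; the function name, threshold, and window size are assumptions made for the example:

```python
from datetime import datetime, timedelta

def stream_fraud_check(events, threshold=3, window=timedelta(minutes=1)):
    """Flag a card the moment it reaches `threshold` charges inside a
    sliding time window -- the CEP-style 'respond to a pattern of
    events as they occur' approach, as opposed to an end-of-day batch."""
    recent = {}   # card -> timestamps still inside the window
    alerts = []
    for e in events:  # events are processed one at a time, in arrival order
        times = [t for t in recent.get(e["card"], []) if e["ts"] - t < window]
        times.append(e["ts"])
        recent[e["card"]] = times
        if len(times) >= threshold:
            alerts.append((e["card"], e["ts"]))  # immediate response
    return alerts

# Four charges on the same card, 10 seconds apart: the alert fires on
# the third charge, long before any end-of-day job would run.
t0 = datetime(2024, 1, 1, 12, 0, 0)
events = [{"card": "A", "ts": t0 + timedelta(seconds=10 * i)} for i in range(4)]
alerts = stream_fraud_check(events)
```

A batch version of the same rule would compute identical alerts, but only after collecting the whole day's events, which is the latency difference the chapter uses to motivate stream processing.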