Q: 13
A company wants to use large language models (LLMs) with Amazon Bedrock to develop a chat
interface for the company's product manuals. The manuals are stored as PDF files.
Which solution meets these requirements MOST cost-effectively?
Options
Discussion
C, not D. The official guide talks about fine-tuning models with custom data; worth double-checking the sample labs.
D imo. A knowledge base with Bedrock is made for exactly this, plus you only pay for the relevant context per prompt. Cheaper than fine-tuning or jamming all the PDFs into each query. Pretty sure D is right, but open to pushback if I missed something.
It's D here. Uploading the PDFs to a Bedrock knowledge base lets you pull in just what you need per prompt, which saves way more on token costs than fine-tuning or shoving all the docs into the context. Makes sense unless I missed something?
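To make the "only relevant context per prompt" point concrete, here's a minimal sketch of what the knowledge-base route looks like with boto3's `bedrock-agent-runtime` `retrieve_and_generate` call. The knowledge base ID, model ARN, and question below are placeholders I made up, not anything from the question; the request is built as a plain dict so the shape is visible without needing AWS credentials.

```python
def build_kb_query(question: str, kb_id: str, model_arn: str) -> dict:
    """Build the retrieve_and_generate request for a Bedrock knowledge base.

    Only the retrieved chunks relevant to `question` are passed to the model,
    which is why this is cheaper than stuffing every manual into each prompt.
    """
    return {
        "input": {"text": question},
        "retrieveAndGenerateConfiguration": {
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,   # placeholder ID
                "modelArn": model_arn,      # placeholder ARN
            },
        },
    }

# The actual call (requires AWS credentials and a provisioned knowledge base):
#   import boto3
#   client = boto3.client("bedrock-agent-runtime")
#   response = client.retrieve_and_generate(
#       **build_kb_query("How do I reset the device?", "kb-example-id",
#                        "arn:aws:bedrock:us-east-1::foundation-model/example"))
#   print(response["output"]["text"])

params = build_kb_query("How do I reset the device?", "kb-example-id",
                        "arn:aws:bedrock:us-east-1::foundation-model/example")
print(params["input"]["text"])
```

The request structure is the point: you send one user question, Bedrock retrieves only the matching passages from the indexed PDFs, and the token bill covers just those passages instead of every manual.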
A is wrong, D. A knowledge base with Bedrock is much more cost-effective since you only pass the relevant info per query.
A tbh. I figured adding one PDF as context would keep costs low since you’re not uploading everything at once. It seems scalable if the user only asks about one manual at a time, but maybe I’m missing something obvious?