# Include few-shot examples

You can include examples in the prompt that show the model what a good response looks like. The model attempts to identify patterns and relationships from the examples and applies them when generating a response. Prompts that contain examples are called *few-shot* prompts, while prompts that provide no examples are called *zero-shot* prompts. Few-shot prompts are often used to regulate the output formatting, phrasing, scoping, or general patterning of model responses. Use specific and varied examples to help the model narrow its focus and generate more accurate results.

Including few-shot examples in your prompts helps make them more reliable and effective. However, you should always accompany few-shot examples with clear instructions. Without clear instructions, models might pick up unintended patterns or relationships from the examples, which can lead to poor results.

The key points to this strategy are as follows:

- Including prompt-response examples in the prompt helps the model learn how to respond.
- Use XML-like markup to mark up the examples.
- Experiment with the number of examples to include. Depending on the model, too few examples are ineffective at changing model behavior, while too many can cause the model to overfit.
- Use consistent formatting across examples.

Zero-shot versus few-shot prompts
---------------------------------

The following zero-shot prompt asks the model to extract the technical specifications from text and output them in JSON format:

```
Extract the technical specifications from the text below in JSON format.

Google Pixel 7, 5G network, 8GB RAM, Tensor G2 processor, 128GB of storage, Lemongrass
```

Suppose that your use case requires specific formatting, such as using lowercase key names. You can include examples in the prompt that show the model how to format the JSON. The following few-shot prompt demonstrates an output format where the JSON keys are lowercase:

```
Extract the technical specifications from the text below in JSON format.

<EXAMPLE>
INPUT: Google Nest Wifi, network speed up to 1200Mbps, 2.4GHz and 5GHz frequencies, WP3 protocol
OUTPUT:
{
  "product":"Google Nest Wifi",
  "speed":"1200Mbps",
  "frequencies": ["2.4GHz", "5GHz"],
  "protocol":"WP3"
}
</EXAMPLE>

Google Pixel 7, 5G network, 8GB RAM, Tensor G2 processor, 128GB of storage, Lemongrass
```

Note that the example uses XML-like formatting to separate the components of the prompt. To learn more about how to optimally format few-shot prompts using XML-like formatting, see [Structure prompts](/vertex-ai/generative-ai/docs/learn/prompts/structure-prompts).

Find the optimal number of examples
-----------------------------------

You can experiment with the number of examples to provide in the prompt for the best results. Models like Gemini can often pick up on patterns from only a few examples, though you may need to experiment with how many examples lead to the desired results. At the same time, if you include too many examples, the model might start to [overfit](https://developers.google.com/machine-learning/glossary#overfitting) the response to the examples.

What's next
-----------

- Explore more examples of prompts in the [Prompt gallery](/vertex-ai/generative-ai/docs/prompt-gallery).

*Last updated (UTC): 2025-08-28.*
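The few-shot pattern described above — an instruction, XML-like `<EXAMPLE>` blocks, then the final query, all consistently formatted — can be assembled programmatically, which also makes it easy to vary the number of examples while you search for the optimal count. The sketch below is illustrative only: the function and parameter names are hypothetical, not part of any Vertex AI SDK.

```python
# Minimal sketch: build a few-shot prompt using the XML-like <EXAMPLE>
# markup shown above. Names here are illustrative, not a real SDK API.
import json


def build_few_shot_prompt(instruction, examples, query, num_shots=None):
    """Assemble a prompt from an instruction, (input, output) example
    pairs, and a final query. `num_shots` limits how many examples are
    included, so you can experiment with the optimal number."""
    shots = examples if num_shots is None else examples[:num_shots]
    parts = [instruction]
    for inp, out in shots:
        # Keep every example in the same format: INPUT line, then the
        # expected OUTPUT as pretty-printed JSON.
        parts.append(
            "<EXAMPLE>\n"
            f"INPUT: {inp}\n"
            "OUTPUT:\n"
            f"{json.dumps(out, indent=2)}\n"
            "</EXAMPLE>"
        )
    parts.append(query)
    return "\n\n".join(parts)


prompt = build_few_shot_prompt(
    instruction=(
        "Extract the technical specifications from the text below "
        "in JSON format."
    ),
    examples=[(
        "Google Nest Wifi, network speed up to 1200Mbps, "
        "2.4GHz and 5GHz frequencies, WP3 protocol",
        {
            "product": "Google Nest Wifi",
            "speed": "1200Mbps",
            "frequencies": ["2.4GHz", "5GHz"],
            "protocol": "WP3",
        },
    )],
    query=(
        "Google Pixel 7, 5G network, 8GB RAM, Tensor G2 processor, "
        "128GB of storage, Lemongrass"
    ),
)
print(prompt)
```

Because the examples live in a plain list, adding, removing, or reordering shots never disturbs the formatting, which keeps the examples consistent with each other as the key points above recommend.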