8 changes: 8 additions & 0 deletions ai/ai-react-app/README.md
@@ -24,6 +24,14 @@ For more information about the Firebase AI SDK, see the [Firebase AI Logic Docs]
yarn dev
```

## Hybrid Mode (On-Device Inference)

This sample supports Hybrid Mode, which runs inference on-device using Chrome's built-in Prompt API (Gemini Nano) when it is available, and falls back to cloud inference otherwise.

To use Hybrid Mode:
1. Enable the "Prompt API" in Chrome. See the [Chrome AI Prompt API documentation](https://developer.chrome.com/docs/ai/prompt-api) for instructions on how to enable it and download the required model (Gemini Nano).
2. Toggle "Hybrid Mode" in the right sidebar of the application.
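
The Hybrid Mode toggle maps to one of four `InferenceMode` values from `firebase/ai`. As a rough sketch of their semantics — this is illustrative decision logic under my reading of the mode names, not the SDK's actual routing code:

```typescript
// Illustrative only: how the four inference modes route a single request.
// The real routing (and error handling) lives inside the Firebase AI SDK.
type Mode = "prefer_on_device" | "only_on_device" | "only_in_cloud" | "prefer_in_cloud";

function pickBackend(mode: Mode, onDeviceReady: boolean): "on-device" | "in-cloud" {
  switch (mode) {
    case "only_on_device":
      // Hard requirement: fail rather than send the prompt to the cloud.
      if (!onDeviceReady) throw new Error("On-device model unavailable");
      return "on-device";
    case "only_in_cloud":
      return "in-cloud";
    case "prefer_on_device":
      // Use Gemini Nano when the Prompt API reports it ready, else the cloud.
      return onDeviceReady ? "on-device" : "in-cloud";
    case "prefer_in_cloud":
      // Cloud first; falling back on-device only when the network call fails
      // is handled inside the SDK, not in this sketch.
      return "in-cloud";
  }
}

console.log(pickBackend("prefer_on_device", false)); // "in-cloud"
```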

## Support

- [Firebase Support](https://firebase.google.com/support/)
11 changes: 11 additions & 0 deletions ai/ai-react-app/src/components/Layout/MainLayout.tsx
@@ -15,6 +15,7 @@ import {
GoogleAIBackend,
getAI,
ResponseModality,
InferenceMode,
} from "firebase/ai";
import {
AVAILABLE_GENERATIVE_MODELS,
@@ -51,6 +52,8 @@ const MainLayout: React.FC<MainLayoutProps> = ({
responseModalities: [ResponseModality.TEXT, ResponseModality.IMAGE],
},
});
const [isHybridMode, setIsHybridMode] = useState(false);
const [inferenceMode, setInferenceMode] = useState<InferenceMode>(InferenceMode.PREFER_ON_DEVICE);
const [selectedAspectRatio, setSelectedAspectRatio] = useState<string | undefined>();

const [usageMetadata, setUsageMetadata] = useState<UsageMetadata | null>(
@@ -111,6 +114,8 @@ const MainLayout: React.FC<MainLayoutProps> = ({
onUsageMetadataChange={setUsageMetadata}
currentParams={generativeParams}
activeMode={activeMode}
isHybridMode={isHybridMode}
inferenceMode={inferenceMode}
/>
);
case "nanobanana":
@@ -129,6 +134,8 @@ const MainLayout: React.FC<MainLayoutProps> = ({
onUsageMetadataChange={setUsageMetadata}
currentParams={generativeParams}
activeMode={activeMode}
isHybridMode={isHybridMode}
inferenceMode={inferenceMode}
/>
);
}
@@ -160,6 +167,10 @@ const MainLayout: React.FC<MainLayoutProps> = ({
setNanoBananaParams={setNanoBananaParams}
selectedAspectRatio={selectedAspectRatio}
setSelectedAspectRatio={setSelectedAspectRatio}
isHybridMode={isHybridMode}
setIsHybridMode={setIsHybridMode}
inferenceMode={inferenceMode}
setInferenceMode={setInferenceMode}
/>
</div>
</div>
136 changes: 136 additions & 0 deletions ai/ai-react-app/src/components/Layout/RightSidebar.tsx
@@ -15,6 +15,7 @@ import {
FunctionCallingMode,
UsageMetadata,
ResponseModality,
InferenceMode,
} from "firebase/ai";

export interface ExtendedGenerationConfig extends GenerationConfig {
@@ -34,6 +35,10 @@ interface RightSidebarProps {
setNanoBananaParams: React.Dispatch<React.SetStateAction<ModelParams>>;
selectedAspectRatio?: string;
setSelectedAspectRatio: (ar?: string) => void;
isHybridMode: boolean;
setIsHybridMode: React.Dispatch<React.SetStateAction<boolean>>;
inferenceMode: InferenceMode;
setInferenceMode: React.Dispatch<React.SetStateAction<InferenceMode>>;
}

const RightSidebar: React.FC<RightSidebarProps> = ({
@@ -45,7 +50,58 @@ const RightSidebar: React.FC<RightSidebarProps> = ({
setNanoBananaParams,
selectedAspectRatio,
setSelectedAspectRatio,
isHybridMode,
setIsHybridMode,
inferenceMode,
setInferenceMode,
}) => {
const [schemaText, setSchemaText] = React.useState(
JSON.stringify(generativeParams.generationConfig?.responseJsonSchema || {}, null, 2)
);
const [modelStatus, setModelStatus] = React.useState<string>("unknown");

React.useEffect(() => {
setSchemaText(
JSON.stringify(generativeParams.generationConfig?.responseJsonSchema || {}, null, 2)
);
}, [generativeParams.generationConfig?.responseJsonSchema]);

React.useEffect(() => {
if (isHybridMode) {
checkModelAvailability();
}
}, [isHybridMode]);

const checkModelAvailability = async () => {
setModelStatus("checking");
try {
const ai = (window as any).LanguageModel;
[Review comment — high]

The Chrome Prompt API is typically accessed via window.ai.languageModel. Accessing window.LanguageModel directly is likely to fail in standard Chrome environments, causing the availability check to incorrectly report the model as unavailable.

Suggested change:
-      const ai = (window as any).LanguageModel;
+      const ai = (window as any).ai?.languageModel;
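
The entry point has in fact differed across Chrome releases: early origin-trial builds exposed the Prompt API as `window.ai.languageModel`, while newer builds expose a global `LanguageModel`. A hypothetical resolver (not part of this PR) could probe both, keeping the availability check working in either environment:

```typescript
// Hypothetical helper: accept whichever Prompt API entry point the browser
// exposes. The shape below only declares the one method this sketch needs.
type PromptApiHost = {
  LanguageModel?: { availability(): Promise<string> };
  ai?: { languageModel?: { availability(): Promise<string> } };
};

function resolveLanguageModel(host: PromptApiHost) {
  // Prefer the newer global, fall back to the older window.ai namespace.
  return host.LanguageModel ?? host.ai?.languageModel;
}

// Usage inside checkModelAvailability:
//   const ai = resolveLanguageModel(window as unknown as PromptApiHost);
```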

if (!ai) {
setModelStatus("unavailable");
return;
}
const availability = await ai.availability();
setModelStatus(availability);
} catch (err) {
console.error("Error checking model availability:", err);
setModelStatus("error");
}
};

const handleDownloadModel = async () => {
setModelStatus("downloading");
try {
const ai = (window as any).LanguageModel;
[Review comment — high]

Similar to the availability check, the model creation should use the window.ai.languageModel entry point to ensure compatibility with the Chrome Prompt API.

Suggested change:
-      const ai = (window as any).LanguageModel;
+      const ai = (window as any).ai?.languageModel;

if (ai) {
await ai.create();
setModelStatus("available");
}
} catch (err) {
console.error("Error downloading model:", err);
setModelStatus("error");
}
};

const handleModelParamsUpdate = (
updateFn: (prevState: ModelParams) => ModelParams,
) => {
@@ -154,6 +210,8 @@ const RightSidebar: React.FC<RightSidebarProps> = ({
if (checked) {
// Turn ON JSON
nextState.generationConfig.responseMimeType = "application/json";
nextState.generationConfig.responseJsonSchema = { type: "object", properties: {} }; // Default schema
nextState.generationConfig.responseSchema = undefined;
[Review comment on lines +213 to +214 — medium]

The assignment to responseSchema on line 214 is redundant because it is performed again on line 217 within the same logical block. Removing the duplicate assignment improves code clarity.

Suggested change:
-        nextState.generationConfig.responseJsonSchema = { type: "object", properties: {} }; // Default schema
-        nextState.generationConfig.responseSchema = undefined;
+        nextState.generationConfig.responseJsonSchema = { type: "object", properties: {} }; // Default schema


// Turn OFF Function Calling by clearing its related fields
nextState.generationConfig.responseSchema = undefined;
Expand All @@ -162,6 +220,7 @@ const RightSidebar: React.FC<RightSidebarProps> = ({
} else {
// Turn OFF JSON
nextState.generationConfig.responseMimeType = undefined;
nextState.generationConfig.responseJsonSchema = undefined;
nextState.generationConfig.responseSchema = undefined;
}
} else if (name === "function-call-toggle") {
@@ -178,6 +237,7 @@ const RightSidebar: React.FC<RightSidebarProps> = ({

// Turn OFF JSON mode by clearing its related fields
nextState.generationConfig.responseMimeType = undefined;
nextState.generationConfig.responseJsonSchema = undefined;
nextState.generationConfig.responseSchema = undefined;
} else {
// Turn OFF Function Calling
@@ -191,6 +251,7 @@ const RightSidebar: React.FC<RightSidebarProps> = ({

// Turn OFF JSON mode and Function Calling
nextState.generationConfig.responseMimeType = undefined;
nextState.generationConfig.responseJsonSchema = undefined;
nextState.generationConfig.responseSchema = undefined;
nextState.toolConfig = undefined;
} else {
@@ -342,6 +403,54 @@ const RightSidebar: React.FC<RightSidebarProps> = ({

<div>
<h5 className={styles.subSectionTitle}>Tools</h5>
<div className={styles.toggleGroup}>
<label htmlFor="hybrid-mode-toggle">Hybrid Mode</label>
<label className={styles.switch}>
<input
type="checkbox"
id="hybrid-mode-toggle"
name="hybrid-mode-toggle"
checked={isHybridMode}
onChange={(e) => setIsHybridMode(e.target.checked)}
/>
<span className={styles.slider}></span>
</label>
</div>
{isHybridMode && (
<>
<div style={{ fontSize: '0.8rem', color: '#666', marginTop: '5px', marginBottom: '10px' }}>
To use on-device inference, ensure you have enabled the Prompt API in Chrome and downloaded the model.
See <a href="https://developer.chrome.com/docs/ai/prompt-api" target="_blank" rel="noopener noreferrer">Chrome AI Prompt API</a> for details.
</div>
<div className={styles.controlGroup}>
<label>Model Status</label>
<div style={{ display: 'flex', alignItems: 'center', gap: '10px' }}>
<span style={{ textTransform: 'capitalize' }}>{modelStatus}</span>
{modelStatus === "downloadable" && (
<button onClick={handleDownloadModel} style={{ padding: '2px 5px', fontSize: '0.8rem' }}>
Download
</button>
)}
{modelStatus === "downloading" && (
<span className={styles.spinner}>⏳</span> // Using an emoji as a simple spinner
)}
</div>
</div>
<div className={styles.controlGroup}>
<label htmlFor="inference-mode-select">Inference Mode</label>
<select
id="inference-mode-select"
value={inferenceMode}
onChange={(e) => setInferenceMode(e.target.value as InferenceMode)}
>
<option value={InferenceMode.PREFER_ON_DEVICE}>Prefer On-Device</option>
<option value={InferenceMode.ONLY_ON_DEVICE}>Only On-Device</option>
<option value={InferenceMode.ONLY_IN_CLOUD}>Only In-Cloud</option>
<option value={InferenceMode.PREFER_IN_CLOUD}>Prefer In-Cloud</option>
</select>
</div>
</>
)}
<div
className={`${styles.toggleGroup} ${isFunctionCallingActive ? styles.disabledText : ""}`}
>
@@ -364,6 +473,33 @@ const RightSidebar: React.FC<RightSidebarProps> = ({
></span>
</label>
</div>
{isStructuredOutputActive && (
<div className={styles.controlGroup} style={{ marginTop: "10px", marginBottom: "10px" }}>
<label htmlFor="json-schema-input" style={{ display: "block", marginBottom: "5px" }}>JSON Schema</label>
<textarea
id="json-schema-input"
rows={5}
placeholder='e.g. { "type": "object", "properties": { "response": { "type": "string" } } }'
value={schemaText}
onChange={(e) => {
setSchemaText(e.target.value);
try {
const schema = JSON.parse(e.target.value);
handleModelParamsUpdate((prev: ModelParams) => ({
...prev,
generationConfig: {
...prev.generationConfig,
responseJsonSchema: schema,
},
}));
} catch (err) {
// Ignore invalid JSON while typing
}
[Review comment on lines +486 to +497 — medium]

The try-catch block silently ignores JSON parsing errors. While this prevents the application from crashing during typing, the user receives no feedback when their schema contains a syntax error. Consider adding a local state to track and display validation errors to the user for a better experience.
}}
style={{ width: "100%", fontFamily: "monospace", fontSize: "0.8rem", padding: "5px", boxSizing: "border-box" }}
/>
</div>
)}
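
Following up on the review note about silent parse failures: one way to surface errors would be a small pure helper (hypothetical, not in this PR) that returns either the parsed schema or a message the component can keep in local state and render under the textarea:

```typescript
// Hypothetical helper sketching the reviewer's suggestion: parse once,
// return either a usable schema object or a human-readable error string.
function parseSchemaText(text: string): { schema?: object; error?: string } {
  try {
    const parsed = JSON.parse(text);
    // A JSON Schema for responseJsonSchema should be a plain object,
    // not an array or primitive.
    if (typeof parsed !== "object" || parsed === null || Array.isArray(parsed)) {
      return { error: "Schema must be a JSON object" };
    }
    return { schema: parsed };
  } catch (err) {
    // JSON.parse throws a SyntaxError with a position-bearing message.
    return { error: (err as Error).message };
  }
}

// Usage in the onChange handler:
//   const { schema, error } = parseSchemaText(e.target.value);
//   setSchemaError(error ?? null);            // new local state for display
//   if (schema) { /* push into generationConfig as before */ }
```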
<div
className={`${styles.toggleGroup} ${isStructuredOutputActive || isGroundingWithGoogleSearchActive ? styles.disabledText : ""}`}
>
11 changes: 5 additions & 6 deletions ai/ai-react-app/src/services/firebaseAIService.ts
@@ -17,19 +17,18 @@ import {
import { firebaseConfig } from "../config/firebase-config";

export const AVAILABLE_GENERATIVE_MODELS = [
"gemini-2.0-flash",
"gemini-2.0-flash-lite",
"gemini-2.0-flash-exp",
"gemini-2.5-flash"
"gemini-2.5-pro",
"gemini-2.5-flash",
"gemini-2.5-flash-lite"
];
export const AVAILABLE_NANO_BANANA_MODELS = [
"gemini-3-pro-image-preview",
"gemini-3.1-flash-image-preview",
"gemini-2.5-flash-image",
];
export const LIVE_MODELS = new Map<BackendType, string>([
[BackendType.GOOGLE_AI, 'gemini-2.5-flash-native-audio-preview-09-2025'],
[BackendType.VERTEX_AI, 'gemini-live-2.5-flash-preview-native-audio-09-2025']
[BackendType.GOOGLE_AI, 'gemini-2.5-flash-native-audio-preview-12-2025'],
[BackendType.VERTEX_AI, 'gemini-live-2.5-flash-native-audio']
])

let app: FirebaseApp;