Provides mood/emotion analysis capabilities to child components. Wraps the HiyveClient sentiment analyzers and exposes a simple API for registering video elements and receiving mood state updates.
There are four ways to access participant mood data:

- onMoodChange callback - real-time updates when any participant's mood changes
- moodStates Map - access all mood states via the useMoodAnalysis() hook
- getMoodState(userId) - look up a specific participant's mood
- useMoodState(userId) - hook to subscribe to a specific user's mood

Callbacks:

- onMoodChange(userId, moodState) - called whenever any participant's mood changes. This is the recommended way to react to mood changes in real time.
- onReady() - called when the analyzer is fully initialized and ready to process.
- onError(error) - called when an error occurs during initialization or analysis.

Each mood state object has the following shape:

{
  emotion: 'neutral' | 'happy' | 'sad' | 'angry' | 'fearful' | 'disgusted' | 'surprised',
  confidence: number,   // 0-1, detection confidence
  engagement: number,   // 0-1, attention level
  offFrame: boolean,    // true when the face is not visible
  gracePeriod: boolean, // true during the grace period after a face leaves the frame
  timestamp: number,    // Unix timestamp
  enabled: boolean      // whether analysis is active
}
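The shape above can be mirrored as a local TypeScript type, together with a small helper for deciding when a reading is trustworthy. This is a sketch: `isReliable` and its 0.5 threshold are illustrative, not part of the library.

```typescript
// Local mirror of the documented mood state shape.
type Emotion = 'neutral' | 'happy' | 'sad' | 'angry' | 'fearful' | 'disgusted' | 'surprised';

interface MoodState {
  emotion: Emotion;
  confidence: number;   // 0-1, detection confidence
  engagement: number;   // 0-1, attention level
  offFrame: boolean;    // true when the face is not visible
  gracePeriod: boolean; // true during the grace period after a face leaves the frame
  timestamp: number;    // Unix timestamp
  enabled: boolean;     // whether analysis is active
}

// Illustrative helper (not part of the package): treat a reading as usable
// only when analysis is on, a face is visible, and the detector is confident.
function isReliable(state: MoodState, minConfidence = 0.5): boolean {
  return state.enabled && !state.offFrame && state.confidence >= minConfidence;
}
```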
Basic usage with onMoodChange callback:
import { useState } from 'react';
import { MoodAnalysisProvider, useMoodAnalysis } from '@hiyve/react-intelligence';
// MoodState is assumed to be exported by the package alongside the provider.
import type { MoodState } from '@hiyve/react-intelligence';

function App() {
  const [moodEnabled, setMoodEnabled] = useState(false);

  const handleMoodChange = (userId: string, moodState: MoodState) => {
    console.log(`${userId}: ${moodState.emotion} (confidence: ${moodState.confidence})`);

    // Example: track engagement metrics
    if (moodState.engagement < 0.3) {
      console.log(`${userId} appears distracted`);
    }

    // Example: send to analytics (assumes an `analytics` client in scope)
    analytics.track('mood_update', {
      participantId: userId,
      emotion: moodState.emotion,
      engagement: moodState.engagement,
    });
  };

  return (
    <MoodAnalysisProvider
      enabled={moodEnabled}
      analyzerType="human"
      onMoodChange={handleMoodChange}
      onReady={() => console.log('Mood analysis ready!')}
      onError={(err) => console.error('Mood error:', err)}
    >
      <VideoRoom />
    </MoodAnalysisProvider>
  );
}
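The `analytics.track` call in the example assumes some analytics client is already in scope. A minimal, hypothetical sink that buffers mood events for batched delivery might look like this (`MoodEventBuffer` is illustrative and not part of the package):

```typescript
// Hypothetical event sink: buffers mood_update payloads so they can be
// flushed in batches instead of being sent on every detection.
interface MoodEvent {
  participantId: string;
  emotion: string;
  engagement: number; // 0-1
}

class MoodEventBuffer {
  private events: MoodEvent[] = [];

  // Matches the analytics.track(eventName, payload) shape used above.
  track(_eventName: string, event: MoodEvent): void {
    this.events.push(event);
  }

  // Drain and return everything collected since the last flush.
  flush(): MoodEvent[] {
    const out = this.events;
    this.events = [];
    return out;
  }
}
```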
Accessing mood data via hooks inside components:
function MoodDashboard() {
  const { moodStates, getMoodState, ready } = useMoodAnalysis();

  if (!ready) return <div>Loading mood analysis...</div>;

  // Method 1: iterate all participants
  const happyParticipants = Array.from(moodStates.entries())
    .filter(([_, state]) => state.emotion === 'happy')
    .map(([userId]) => userId);

  // Method 2: get a specific participant
  const hostMood = getMoodState('host-id');

  return (
    <div>
      <p>Happy: {happyParticipants.join(', ')}</p>
      <p>Host mood: {hostMood?.emotion ?? 'unknown'}</p>
    </div>
  );
}
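The `moodStates` Map also lends itself to plain aggregation outside of render. A sketch follows; the 0.3 engagement threshold mirrors the earlier example and is not a library constant, and `MoodReading` is a local stand-in for the relevant mood state fields.

```typescript
// Local stand-in for the fields of a mood state used here.
interface MoodReading {
  emotion: string;
  engagement: number; // 0-1
  offFrame: boolean;
}

// Average engagement across participants whose faces are currently visible.
function averageEngagement(moodStates: Map<string, MoodReading>): number {
  const visible = Array.from(moodStates.values()).filter((s) => !s.offFrame);
  if (visible.length === 0) return 0;
  return visible.reduce((sum, s) => sum + s.engagement, 0) / visible.length;
}

// Participants below the engagement threshold used in the earlier example.
function distractedParticipants(
  moodStates: Map<string, MoodReading>,
  threshold = 0.3
): string[] {
  return Array.from(moodStates.entries())
    .filter(([, s]) => !s.offFrame && s.engagement < threshold)
    .map(([userId]) => userId);
}
```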
Integration with VideoGrid (automatic tile registration):
import { useState } from 'react';
import { MoodAnalysisProvider } from '@hiyve/react-intelligence';
import { ConnectedVideoGrid } from '@hiyve/react-ui';

function VideoRoom() {
  const [moodEnabled, setMoodEnabled] = useState(false);

  return (
    <MoodAnalysisProvider
      enabled={moodEnabled}
      onMoodChange={(userId, state) => {
        // Mood data is available here for all participants.
        // VideoTile components auto-register their video elements.
      }}
    >
      <ConnectedVideoGrid showMood={moodEnabled} />
    </MoodAnalysisProvider>
  );
}
Custom analyzer configuration:
<MoodAnalysisProvider
  enabled={true}
  analyzerType="mediapipe" // 'human' | 'mediapipe' | 'faceapi'
  options={{
    detectionInterval: 150,  // ms between detections (default: 200)
    minConfidence: 0.3,      // minimum face detection confidence (default: 0.2)
    smoothing: true,         // enable emotion smoothing (default: false)
    smoothingFactor: 0.3,    // smoothing strength (default: 0.3)
    noFaceGracePeriod: 2000, // ms before marking off-frame (default: 2000)
    debug: false,            // enable debug logging (default: false)
  }}
  onMoodChange={(userId, state) => handleMoodChange(userId, state)}
  onReady={() => console.log('Analyzer initialized')}
  onError={(err) => console.error('Analyzer error:', err)}
>
  {children}
</MoodAnalysisProvider>
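One plausible reading of the `smoothing` / `smoothingFactor` options is an exponential moving average over successive readings; the library's actual algorithm is not documented here, so treat this as an illustration only.

```typescript
// Exponential moving average: blends each new reading into the previous one.
// Illustrates what a smoothingFactor of 0.3 could mean; the library's real
// smoothing implementation may differ.
function smooth(previous: number, next: number, factor = 0.3): number {
  // factor = 0 freezes at the previous value; factor = 1 jumps straight to `next`.
  return previous + factor * (next - previous);
}
```

With factor 0.3, a confidence reading jumping from 0 to 1 would first surface as 0.3, damping frame-to-frame flicker between emotions.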