more redux

Convert media- and mixer-related state into Redux.
Nuwan 2026-01-13 18:17:14 +05:30
parent 25e049c4f5
commit aea78a65a8
20 changed files with 3715 additions and 944 deletions


@ -21,7 +21,19 @@
"Bash(chmod:*)",
"Bash(git mv:*)",
"Bash(./run-phase-tests.sh:*)",
"Bash(npm run test:unit:*)"
"Bash(npm run test:unit:*)",
"Bash(find:*)",
"Bash(npm test:*)",
"Bash(node -e:*)",
"Bash(npm run build:*)",
"Bash(export NODE_OPTIONS=\"--openssl-legacy-provider\")",
"Bash(timeout 180 npm run build:*)",
"Bash(npx jest:*)",
"Bash(npx eslint:*)",
"Bash(lsof:*)",
"Bash(xargs kill -9)",
"Bash(node -c:*)",
"Skill(gsd:new-project)"
]
}
}

.planning/PROJECT.md (new file, 132 lines)

@ -0,0 +1,132 @@
# JamKazam Media Features Modernization
**One-liner:** Modernize media opening features (Backing Track, JamTrack, Metronome) from legacy jQuery/Rails to React patterns in jam-ui
## Vision
Transform the media opening workflow from the legacy web project into modern React patterns in jam-ui. The features exist in the legacy codebase with jQuery dialogs and polling-based playback monitoring - we're bringing them into the React architecture with proper component structure, hooks, and Redux integration where appropriate.
**Target Features:**
1. **Backing Track** - File-based audio playback with player controls (FIRST PRIORITY)
2. **JamTrack** - Collaborative track loading with mixdown selection
3. **Metronome** - Tempo/sound configuration and playback
All three features are accessible via the "Open" menu in the session screen top navigation.
## Context
### Current State
**Backing Track (Partially Implemented):**
- ✓ Native file dialog integration via `jamClient.ShowSelectBackingTrackDialog()`
- ✓ File opening via `jamClient.SessionOpenBackingTrackFile()`
- ✓ Track display in session with VU meter, gain, pan controls
- ✓ Basic player modal with play/pause/stop buttons
- ✓ Volume control working
- ✓ Loop toggle implemented
- ✗ **Seek bar** - Exists but is placeholder (hardcoded value="0")
- ✗ **Duration display** - Hardcoded to "0:00"
- ✗ **Current time display** - Hardcoded to "0:00"
**JamTrack:** Not yet implemented in jam-ui
**Metronome:** Not yet implemented in jam-ui
### Architecture Context
- **Frontend:** React 16.13.1 SPA with Redux Toolkit 1.6.1
- **Native Bridge:** C++ desktop client exposed via `jamClientProxy.js` QWebChannel tunnel
- **State Management:** Redux for global state, local component state for transient playback data
- **Legacy Reference:** jQuery-based implementation in `web/app/assets/javascripts/` using 500ms polling
### Recent Work
**Phase 5 Redux Migration (Completed):**
- Migrated MediaContext to Redux (`mediaSlice.js`, `mixersSlice.js`)
- Eliminated duplicate state across contexts and hooks
- Fixed VU meter bug with personal mixer mode flag handling
- Established clean separation: Redux for data, local state for UI
## Requirements
### Validated
- ✓ Native file dialog for track selection - existing pattern works well
- ✓ jamClient API integration - proven in current partial implementation
- ✓ Redux state management architecture - validated in Phase 5 migration
### Active
**Backing Track (Priority 1):**
- [ ] Implement real-time playback monitoring with 500ms polling
- [ ] Display current playback position (MM:SS format)
- [ ] Display total track duration (MM:SS format)
- [ ] Implement functional seek bar with drag-to-position
- [ ] Integrate monitoring lifecycle with play/pause/stop controls
- [ ] Add error handling for edge cases (duration=0, position>duration)
- [ ] Optimize performance (prevent unnecessary re-renders)
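The monitoring and display items above can be sketched framework-agnostically. The jamClient stub below is hypothetical (the method names follow the Constraints section, including its spelling of `SessionCurrrentPlayPosMs`); in the component, the loop would live inside a `useEffect` whose cleanup calls the returned stop function.

```javascript
// Hypothetical stub standing in for the native bridge; the real client
// is exposed via jamClientProxy.js (see Critical Files).
const jamClient = {
  SessionGetTracksPlayDurationMs: async () => 185000,
  SessionCurrrentPlayPosMs: async () => 63500,
  isSessionTrackPlaying: async () => true,
};

// Format a millisecond position as M:SS for the time displays.
function formatMs(ms) {
  const totalSeconds = Math.max(0, Math.floor(ms / 1000));
  const minutes = Math.floor(totalSeconds / 60);
  const seconds = totalSeconds % 60;
  return `${minutes}:${String(seconds).padStart(2, '0')}`;
}

// 500ms polling loop; returns a stop function suitable as a
// useEffect cleanup so unmounting ends the polling.
function startPlaybackMonitor(onTick, intervalMs = 500) {
  const timer = setInterval(async () => {
    const [durationMs, positionMs, playing] = await Promise.all([
      jamClient.SessionGetTracksPlayDurationMs(),
      jamClient.SessionCurrrentPlayPosMs(),
      jamClient.isSessionTrackPlaying(),
    ]);
    // Guard the edge cases called out above: duration=0, position>duration.
    const safeDuration = durationMs > 0 ? durationMs : 0;
    const safePosition = Math.min(positionMs, safeDuration);
    onTick({
      playing,
      position: formatMs(safePosition),
      duration: formatMs(safeDuration),
    });
  }, intervalMs);
  return () => clearInterval(timer);
}
```

Passing the formatted strings (rather than raw ms) to the display keeps re-renders limited to once per second of visible change.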
**JamTrack (Priority 2):**
- [ ] Research legacy implementation patterns
- [ ] Design React component structure
- [ ] Implement file opening workflow
- [ ] Implement mixdown selection (full track vs. specific mixes)
- [ ] Implement player controls similar to Backing Track
**Metronome (Priority 3):**
- [ ] Research legacy implementation patterns
- [ ] Design React component structure
- [ ] Implement tempo/sound/cricket configuration UI
- [ ] Implement start/stop controls
- [ ] Integrate with metronome mixer state
### Out of Scope
- **React-based track list dialog** - Keep native file dialogs (established pattern)
- **Audio processing logic** - All audio handled by native C++ client
- **New media types** - Focus only on existing three features
- **Refactoring unrelated code** - Minimize changes outside media features
## Key Decisions
| Decision | Rationale | Outcome |
|----------|-----------|---------|
| Keep native file dialogs | Already working, platform-native UX, no need to rebuild | Confirmed |
| Start with Backing Track | Most straightforward, establishes patterns for JamTrack/Metronome | In Progress |
| Use local state for playback monitoring | High-frequency transient data doesn't need global Redux visibility | Pending |
| 500ms polling interval | Matches legacy pattern, balances responsiveness vs. performance | Pending |
| Milliseconds for seek values | Direct mapping to jamClient API, no conversion overhead | Pending |
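The "milliseconds for seek values" decision above implies a direct slider-to-API mapping. A hedged sketch: `SessionTrackSeekMs` is the jamClient method named under Constraints, while `seekPositionMs` and `handleSeek` are illustrative names, not codebase identifiers.

```javascript
// Map a 0-100 range-input value to a millisecond position.
function seekPositionMs(sliderPercent, durationMs) {
  const clamped = Math.min(100, Math.max(0, sliderPercent));
  return Math.round((clamped / 100) * durationMs);
}

// Seek directly in milliseconds -- no unit conversion between the
// UI value and the jamClient API call.
async function handleSeek(jamClient, sliderPercent, durationMs) {
  if (durationMs <= 0) return; // nothing to seek into (duration=0 edge case)
  await jamClient.SessionTrackSeekMs(seekPositionMs(sliderPercent, durationMs));
}
```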
## Constraints
### Technical
- **Must use jamClient API methods:** SessionGetTracksPlayDurationMs(), SessionCurrrentPlayPosMs(), isSessionTrackPlaying(), SessionTrackSeekMs()
- **React 16 patterns only** - No async/await in useEffect without proper cleanup
- **Maintain 60fps VU meter performance** - Playback monitoring must not degrade mixer rendering
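The "no async/await in useEffect without proper cleanup" constraint is the standard cancel-flag pattern, sketched here without React itself: `runEffect` models the effect body and its return value models the cleanup function (names are illustrative, not from the codebase).

```javascript
// Models async work inside a React 16 useEffect: the returned cleanup
// flips a flag so late promise resolutions never touch unmounted state.
function runEffect(fetchFn, onResult) {
  let cancelled = false;
  fetchFn().then((value) => {
    if (!cancelled) onResult(value); // skipped once cleanup has run
  });
  return () => { cancelled = true; }; // what useEffect calls on unmount
}
```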
### Business
- **Functional parity with legacy web** - Users expect same capabilities in jam-ui
- **Zero audio glitches** - Seek/playback changes must be seamless
## Critical Files
**Backing Track Implementation:**
- `/Users/nuwan/Code/jam-cloud/jam-ui/src/components/client/JKSessionBackingTrackPlayer.js` - Main work (lines 1-100 reviewed)
- `/Users/nuwan/Code/jam-cloud/jam-ui/src/components/client/JKSessionOpenMenu.js` - Entry point
- `/Users/nuwan/Code/jam-cloud/jam-ui/src/components/client/JKSessionBackingTrack.js` - Track display
**Legacy Reference:**
- `/Users/nuwan/Code/jam-cloud/web/app/assets/javascripts/dialog/openBackingTrackDialog.js` - Pattern reference
- `/Users/nuwan/Code/jam-cloud/web/app/assets/javascripts/playbackControls.js` - Polling/formatting reference
**Redux State:**
- `/Users/nuwan/Code/jam-cloud/jam-ui/src/store/features/activeSessionSlice.js` - backingTrackData
- `/Users/nuwan/Code/jam-cloud/jam-ui/src/store/features/mediaSlice.js` - openBackingTrack thunk
**Native Bridge:**
- `/Users/nuwan/Code/jam-cloud/jam-ui/src/services/jamClientProxy.js` - API reference
---
*Last updated: 2026-01-13 after initialization*


@ -2,7 +2,7 @@ import React, { useState, useEffect, useRef } from 'react';
import { Modal, ModalHeader, ModalBody, ModalFooter, Button, FormGroup, Label, Input } from 'reactstrap';
import { FontAwesomeIcon } from '@fortawesome/react-fontawesome';
import { faPlay, faPause, faStop, faVolumeUp } from '@fortawesome/free-solid-svg-icons';
import { useMediaContext } from '../../context/MediaContext';
import useMediaActions from '../../hooks/useMediaActions';
const JKSessionBackingTrackPlayer = ({
isOpen,
@ -13,7 +13,7 @@ const JKSessionBackingTrackPlayer = ({
currentUser,
isPopup = false
}) => {
const { closeMedia } = useMediaContext();
const { closeMedia } = useMediaActions();
const [isPlaying, setIsPlaying] = useState(false);
const [isLooping, setIsLooping] = useState(false);
const [volume, setVolume] = useState(100);


@ -1,7 +1,6 @@
import React, { useState, useRef, useEffect, useContext } from 'react';
import { createPortal } from 'react-dom';
import { useJamClient } from '../../context/JamClientContext';
import { useMediaContext } from '../../context/MediaContext';
import { useAuth } from '../../context/UserAuth';
import { toast } from 'react-toastify';
import openIcon from '../../assets/img/client/open.svg';


@ -15,8 +15,8 @@ import { useJamServerContext } from '../../context/JamServerContext.js';
import { useGlobalContext } from '../../context/GlobalContext.js';
import { useJamKazamApp } from '../../context/JamKazamAppContext.js';
import { useMixersContext } from '../../context/MixersContext.js';
import { useMediaContext } from '../../context/MediaContext';
import { useAuth } from '../../context/UserAuth';
import useMediaActions from '../../hooks/useMediaActions';
import { dkeys } from '../../helpers/utils.js';
@ -24,6 +24,7 @@ import { getSessionHistory, getSession, joinSession as joinSessionRest, updateSe
// Redux imports
import { openModal, closeModal, toggleModal, selectModal } from '../../store/features/sessionUISlice';
import { selectMediaSummary } from '../../store/features/mixersSlice';
import {
fetchActiveSession,
joinActiveSession,
@ -128,7 +129,10 @@ const JKSessionScreen = () => {
const { globalObject, metronomeState, closeMetronome, resetMetronome } = useGlobalContext();
const { getCurrentRecordingState, reset: resetRecordingState, currentlyRecording } = useRecordingHelpers();
const { SessionPageEnter } = useSessionUtils();
const { mediaSummary, openBackingTrack, openMetronome, loadJamTrack, closeMedia } = useMediaContext();
// Redux media state and actions
const mediaSummary = useSelector(selectMediaSummary);
const { openBackingTrack, openMetronome, loadJamTrack, closeMedia } = useMediaActions();
// Use the session model hook
const sessionModel = useSessionModel(app, server, null); // sessionScreen is null for now


@ -1,35 +1,54 @@
import React, { useEffect, useState, useCallback } from 'react';
import { useSelector } from 'react-redux';
import { Button, Modal, ModalHeader, ModalBody, ModalFooter, FormGroup, Label, Input, Table, Row, Col } from 'reactstrap';
import { FontAwesomeIcon } from '@fortawesome/react-fontawesome';
import { faPlay, faPause, faStop, faVolumeUp, faDownload, faEdit, faTrash, faPlus, faMinus, faQuestionCircle } from '@fortawesome/free-solid-svg-icons';
import { useMediaContext } from '../../context/MediaContext';
import { useAuth } from '../../context/UserAuth';
import { useJamClient } from '../../context/JamClientContext';
import { toast } from 'react-toastify';
import useMediaActions from '../../hooks/useMediaActions';
import { selectMediaSummary, selectMetronome } from '../../store/features/mixersSlice';
import {
selectBackingTracks,
selectJamTracks,
selectRecordedTracks,
selectJamTrackState,
selectDownloadingJamTrack
} from '../../store/features/mediaSlice';
import {
selectShowMyMixes,
selectShowCustomMixes,
selectEditingMixdownId,
selectCreatingMixdown,
selectCreateMixdownErrors
} from '../../store/features/sessionUISlice';
const JKPopupMediaControls = ({ onClose }) => {
const { currentUser } = useAuth();
const { jamClient } = useJamClient();
// Redux state
const mediaSummary = useSelector(selectMediaSummary);
const backingTracks = useSelector(selectBackingTracks);
const jamTracks = useSelector(selectJamTracks);
const recordedTracks = useSelector(selectRecordedTracks);
const metronome = useSelector(selectMetronome);
const jamTrackState = useSelector(selectJamTrackState);
const downloadingJamTrack = useSelector(selectDownloadingJamTrack);
const showMyMixes = useSelector(selectShowMyMixes);
const showCustomMixes = useSelector(selectShowCustomMixes);
const editingMixdownId = useSelector(selectEditingMixdownId);
const creatingMixdown = useSelector(selectCreatingMixdown);
const createMixdownErrors = useSelector(selectCreateMixdownErrors);
// Redux actions
const {
mediaSummary,
backingTracks,
jamTracks,
recordedTracks,
metronome,
jamTrackState,
downloadingJamTrack,
showMyMixes,
showCustomMixes,
editingMixdownId,
creatingMixdown,
createMixdownErrors,
closeMedia,
setShowMyMixes,
setShowCustomMixes,
setEditingMixdownId,
setCreatingMixdown,
setCreateMixdownErrors
} = useMediaContext();
toggleMyMixes,
toggleCustomMixes,
editMixdown,
setMixdownErrors
} = useMediaActions();
const [time, setTime] = useState('0:00');
const [isPlaying, setIsPlaying] = useState(false);
@ -70,10 +89,6 @@ const JKPopupMediaControls = ({ onClose }) => {
}
};
// Toggle mix sections
const toggleMyMixes = () => setShowMyMixes(!showMyMixes);
const toggleCustomMixes = () => setShowCustomMixes(!showCustomMixes);
// JamTrack actions
const handleJamTrackPlay = async (jamTrack) => {
try {
@ -85,7 +100,7 @@ const JKPopupMediaControls = ({ onClose }) => {
const handleMixdownPlay = async (mixdown) => {
try {
setEditingMixdownId(null);
editMixdown(null);
await jamClient.JamTrackActivateMixdown(mixdown);
} catch (error) {
console.error('Error playing mixdown:', error);
@ -93,12 +108,12 @@ const JKPopupMediaControls = ({ onClose }) => {
};
const handleMixdownEdit = (mixdown) => {
setEditingMixdownId(mixdown.id);
editMixdown(mixdown.id);
};
const handleMixdownSave = async (mixdown) => {
// Implementation for saving mixdown name
setEditingMixdownId(null);
editMixdown(null);
};
const handleMixdownDelete = async (mixdown) => {
@ -194,7 +209,7 @@ const JKPopupMediaControls = ({ onClose }) => {
defaultValue={mixdown.name}
onKeyDown={(e) => {
if (e.key === 'Enter') handleMixdownSave(mixdown);
if (e.key === 'Escape') setEditingMixdownId(null);
if (e.key === 'Escape') editMixdown(null);
}}
/>
) : (


@ -1,255 +0,0 @@
import React, { createContext, useContext, useState, useEffect, useCallback } from 'react';
import { useJamServerContext } from './JamServerContext';
import { useJamClient } from './JamClientContext';
// Media types constants
export const MEDIA_TYPES = {
BACKING_TRACK: 'backing_track',
JAM_TRACK: 'jam_track',
RECORDING: 'recording',
METRONOME: 'metronome'
};
// Media states
export const MEDIA_STATES = {
CLOSED: 'closed',
LOADING: 'loading',
OPEN: 'open',
ERROR: 'error'
};
const MediaContext = createContext();
export const MediaProvider = ({ children }) => {
// Core media state
const [mediaSummary, setMediaSummary] = useState({
mediaOpen: false,
backingTrackOpen: false,
jamTrackOpen: false,
recordingOpen: false,
metronomeOpen: false,
isOpener: false,
userNeedsMediaControls: false
});
// Media data
const [backingTracks, setBackingTracks] = useState([]);
const [jamTracks, setJamTracks] = useState([]);
const [recordedTracks, setRecordedTracks] = useState([]);
const [metronome, setMetronome] = useState(null);
const [jamTrackState, setJamTrackState] = useState({});
const [downloadingJamTrack, setDownloadingJamTrack] = useState(false);
// UI state
const [showMyMixes, setShowMyMixes] = useState(false);
const [showCustomMixes, setShowCustomMixes] = useState(false);
const [editingMixdownId, setEditingMixdownId] = useState(null);
const [creatingMixdown, setCreatingMixdown] = useState(false);
const [createMixdownErrors, setCreateMixdownErrors] = useState(null);
// Contexts
const { jamClient } = useJamClient();
const { registerMessageCallback, unregisterMessageCallback } = useJamServerContext();
// Message handlers for real-time updates
const handleMixerChanges = useCallback((sessionMixers) => {
const session = sessionMixers.session;
const mixers = sessionMixers.mixers;
setMediaSummary(prev => ({
...prev,
...mixers.mediaSummary
}));
setBackingTracks(mixers.backingTracks || []);
setJamTracks(mixers.jamTracks || []);
setRecordedTracks(mixers.recordedTracks || []);
setMetronome(mixers.metronome || null);
}, []);
const handleJamTrackChanges = useCallback((changes) => {
setJamTrackState(changes);
}, []);
// NOTE: Disabled automatic WebSocket message handling to prevent infinite popup loops
// Message callbacks will be handled manually by components that need them
// useEffect(() => {
// if (!jamClient) return;
// const callbacks = [
// { type: 'MIXER_CHANGES', callback: handleMixerChanges },
// { type: 'JAM_TRACK_CHANGES', callback: handleJamTrackChanges }
// ];
// callbacks.forEach(({ type, callback }) => {
// registerMessageCallback(type, callback);
// });
// return () => {
// callbacks.forEach(({ type, callback }) => {
// unregisterMessageCallback(type, callback);
// });
// };
// }, [jamClient, registerMessageCallback, unregisterMessageCallback, handleMixerChanges, handleJamTrackChanges]);
// Actions
const openBackingTrack = useCallback(async (file) => {
try {
await jamClient.SessionOpenBackingTrackFile(file, false);
setMediaSummary(prev => ({
...prev,
backingTrackOpen: true,
userNeedsMediaControls: true
}));
} catch (error) {
console.error('Error opening backing track:', error);
throw error;
}
}, [jamClient]);
const closeMedia = useCallback(async (force = false) => {
try {
await jamClient.SessionCloseMedia(force);
setMediaSummary(prev => ({
...prev,
mediaOpen: false,
backingTrackOpen: false,
jamTrackOpen: false,
recordingOpen: false,
metronomeOpen: false,
userNeedsMediaControls: false
}));
} catch (error) {
console.error('Error closing media:', error);
throw error;
}
}, [jamClient]);
const openMetronome = useCallback(async (bpm = 120, sound = "Beep", meter = 1, mode = 0) => {
try {
const result = await jamClient.SessionOpenMetronome(bpm, sound, meter, mode);
setMediaSummary(prev => ({
...prev,
metronomeOpen: true,
userNeedsMediaControls: true
}));
setMetronome({ bpm, sound, meter, mode });
return result;
} catch (error) {
console.error('Error opening metronome:', error);
throw error;
}
}, [jamClient]);
const closeMetronome = useCallback(async () => {
try {
await jamClient.SessionCloseMetronome();
setMediaSummary(prev => ({
...prev,
metronomeOpen: false
}));
setMetronome(null);
} catch (error) {
console.error('Error closing metronome:', error);
throw error;
}
}, [jamClient]);
// JamTrack actions
const loadJamTrack = useCallback(async (jamTrack) => {
try {
setDownloadingJamTrack(true);
// Load JMep data if available
if (jamTrack.jmep) {
const sampleRate = await jamClient.GetSampleRate();
const sampleRateForFilename = sampleRate === 48 ? '48' : '44';
const fqId = `${jamTrack.id}-${sampleRateForFilename}`;
await jamClient.JamTrackLoadJmep(fqId, jamTrack.jmep);
}
// Play/load the jamtrack
const result = await jamClient.JamTrackPlay(jamTrack.id);
if (!result) {
throw new Error('Unable to open JamTrack');
}
setMediaSummary(prev => ({
...prev,
jamTrackOpen: true,
userNeedsMediaControls: true
}));
setDownloadingJamTrack(false);
return result;
} catch (error) {
setDownloadingJamTrack(false);
console.error('Error loading jam track:', error);
throw error;
}
}, [jamClient]);
const closeJamTrack = useCallback(async () => {
try {
await jamClient.JamTrackStopPlay();
setMediaSummary(prev => ({
...prev,
jamTrackOpen: false
}));
setJamTrackState({});
} catch (error) {
console.error('Error closing jam track:', error);
throw error;
}
}, [jamClient]);
// Context value
const value = {
// State
mediaSummary,
backingTracks,
jamTracks,
recordedTracks,
metronome,
jamTrackState,
downloadingJamTrack,
showMyMixes,
showCustomMixes,
editingMixdownId,
creatingMixdown,
createMixdownErrors,
// Actions
openBackingTrack,
closeMedia,
openMetronome,
closeMetronome,
loadJamTrack,
closeJamTrack,
// UI actions
setShowMyMixes,
setShowCustomMixes,
setEditingMixdownId,
setCreatingMixdown,
setCreateMixdownErrors
};
return (
<MediaContext.Provider value={value}>
{children}
</MediaContext.Provider>
);
};
export const useMediaContext = () => {
const context = useContext(MediaContext);
if (!context) {
throw new Error('useMediaContext must be used within a MediaProvider');
}
return context;
};
export default MediaContext;


@ -0,0 +1,210 @@
import { useCallback } from 'react';
import { useDispatch } from 'react-redux';
import {
openBackingTrack as openBackingTrackThunk,
loadJamTrack as loadJamTrackThunk,
closeMedia as closeMediaThunk,
clearJamTrackState,
updateJamTrackState
} from '../store/features/mediaSlice';
import {
setMetronome,
setMetronomeSettings,
updateMediaSummary
} from '../store/features/mixersSlice';
import {
setShowMyMixes,
toggleMyMixes as toggleMyMixesAction,
setShowCustomMixes,
toggleCustomMixes as toggleCustomMixesAction,
setEditingMixdownId,
setCreatingMixdown,
setCreateMixdownErrors
} from '../store/features/sessionUISlice';
import { useJamServerContext } from '../context/JamServerContext';
/**
* Custom hook that provides Redux-based media actions
* Replaces MediaContext actions with Redux thunks and dispatchers
*/
const useMediaActions = () => {
const dispatch = useDispatch();
const { jamClient } = useJamServerContext();
/**
* Open a backing track file
* @param {string} file - Path to the backing track file
*/
const openBackingTrack = useCallback(async (file) => {
try {
await dispatch(openBackingTrackThunk({ file, jamClient })).unwrap();
// Update media summary
dispatch(updateMediaSummary({
backingTrackOpen: true,
userNeedsMediaControls: true
}));
} catch (error) {
console.error('Error opening backing track:', error);
throw error;
}
}, [dispatch, jamClient]);
/**
* Close all media (backing tracks, jam tracks, recordings, metronome)
* @param {boolean} force - Force close even if media is playing
*/
const closeMedia = useCallback(async (force = false) => {
try {
await dispatch(closeMediaThunk({ force, jamClient })).unwrap();
// Update media summary
dispatch(updateMediaSummary({
mediaOpen: false,
backingTrackOpen: false,
jamTrackOpen: false,
recordingOpen: false,
metronomeOpen: false,
userNeedsMediaControls: false
}));
} catch (error) {
console.error('Error closing media:', error);
throw error;
}
}, [dispatch, jamClient]);
/**
* Open metronome with specified settings
* @param {number} bpm - Beats per minute (default: 120)
* @param {string} sound - Metronome sound type (default: "Beep")
* @param {number} meter - Time signature meter (default: 1)
* @param {number} mode - Metronome mode (default: 0)
*/
const openMetronome = useCallback(async (bpm = 120, sound = "Beep", meter = 1, mode = 0) => {
try {
const result = await jamClient.SessionOpenMetronome(bpm, sound, meter, mode);
// Update Redux state
dispatch(setMetronome({ bpm, sound, meter, mode }));
dispatch(setMetronomeSettings({ tempo: bpm, sound, cricket: mode === 1 }));
dispatch(updateMediaSummary({
metronomeOpen: true,
userNeedsMediaControls: true
}));
return result;
} catch (error) {
console.error('Error opening metronome:', error);
throw error;
}
}, [dispatch, jamClient]);
/**
* Close the metronome
*/
const closeMetronome = useCallback(async () => {
try {
await jamClient.SessionCloseMetronome();
// Update Redux state
dispatch(setMetronome(null));
dispatch(updateMediaSummary({
metronomeOpen: false
}));
} catch (error) {
console.error('Error closing metronome:', error);
throw error;
}
}, [dispatch, jamClient]);
/**
* Load and play a JamTrack
* @param {object} jamTrack - JamTrack object with id and optional jmep data
*/
const loadJamTrack = useCallback(async (jamTrack) => {
try {
await dispatch(loadJamTrackThunk({ jamTrack, jamClient })).unwrap();
// Update media summary
dispatch(updateMediaSummary({
jamTrackOpen: true,
userNeedsMediaControls: true
}));
} catch (error) {
console.error('Error loading jam track:', error);
throw error;
}
}, [dispatch, jamClient]);
/**
* Stop and close the currently playing JamTrack
*/
const closeJamTrack = useCallback(async () => {
try {
await jamClient.JamTrackStopPlay();
// Update Redux state
dispatch(clearJamTrackState());
dispatch(updateMediaSummary({
jamTrackOpen: false
}));
} catch (error) {
console.error('Error closing jam track:', error);
throw error;
}
}, [dispatch, jamClient]);
/**
* Update JamTrack playback state (position, playing, etc.)
* @param {object} changes - JamTrack state changes
*/
const updateJamTrackPlayback = useCallback((changes) => {
dispatch(updateJamTrackState(changes));
}, [dispatch]);
// UI Actions
const toggleMyMixes = useCallback(() => {
dispatch(toggleMyMixesAction());
}, [dispatch]);
const toggleCustomMixes = useCallback(() => {
dispatch(toggleCustomMixesAction());
}, [dispatch]);
const editMixdown = useCallback((mixdownId) => {
dispatch(setEditingMixdownId(mixdownId));
}, [dispatch]);
const startCreatingMixdown = useCallback(() => {
dispatch(setCreatingMixdown(true));
}, [dispatch]);
const stopCreatingMixdown = useCallback(() => {
dispatch(setCreatingMixdown(false));
}, [dispatch]);
const setMixdownErrors = useCallback((errors) => {
dispatch(setCreateMixdownErrors(errors));
}, [dispatch]);
return {
// Core media actions
openBackingTrack,
closeMedia,
openMetronome,
closeMetronome,
loadJamTrack,
closeJamTrack,
updateJamTrackPlayback,
// UI actions
toggleMyMixes,
toggleCustomMixes,
editMixdown,
startCreatingMixdown,
stopCreatingMixdown,
setMixdownErrors
};
};
export default useMediaActions;
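The hook above pairs each jamClient call with Redux dispatches; the mediaSummary transitions it relies on can be sketched store-side with a minimal stand-in reducer (illustrative only -- the real reducer lives in mixersSlice.js).

```javascript
// Illustrative stand-in for the mediaSummary portion of the store state.
const initialSummary = {
  mediaOpen: false, backingTrackOpen: false, jamTrackOpen: false,
  recordingOpen: false, metronomeOpen: false, userNeedsMediaControls: false,
};

function mediaSummaryReducer(state = initialSummary, action) {
  if (action.type === 'mixers/updateMediaSummary') {
    // Shallow-merge partial updates, mirroring updateMediaSummary above.
    return { ...state, ...action.payload };
  }
  return state;
}

// openBackingTrack's dispatch marks the track open...
const opened = mediaSummaryReducer(undefined, {
  type: 'mixers/updateMediaSummary',
  payload: { backingTrackOpen: true, userNeedsMediaControls: true },
});

// ...and closeMedia's dispatch resets every flag.
const closed = mediaSummaryReducer(opened, {
  type: 'mixers/updateMediaSummary',
  payload: { ...initialSummary },
});
```

Because updates are shallow merges, each action only names the flags it changes, which is why closeMedia enumerates every flag explicitly.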


@ -1,6 +1,51 @@
import { useState, useEffect, useMemo, useCallback, useRef } from 'react';
import { useSelector } from 'react-redux';
import { useEffect, useMemo, useCallback, useRef } from 'react';
import { useSelector, useDispatch } from 'react-redux';
import { selectActiveSession, selectInSession } from '../store/features/activeSessionSlice';
import {
selectChatMixer,
selectBroadcastMixer,
selectRecordingMixer,
selectRecordingTrackMixers,
selectBackingTrackMixers,
selectJamTrackMixers,
selectMetronomeTrackMixers,
selectAdhocTrackMixers,
selectMasterMixers,
selectPersonalMixers,
selectAllMixers,
selectMixersByResourceId,
selectMixersByTrackId,
selectMetronome,
selectMetronomeSettings,
selectMediaSummary,
selectNoAudioUsers,
selectClientsWithAudioOverride,
selectSimulatedMusicCategoryMixers,
selectSimulatedChatCategoryMixers,
selectMixersReady,
setMasterMixers,
setPersonalMixers,
organizeMixers,
updateMixer,
setChatMixer,
setBackingTracks as setBackingTracksAction,
setJamTracks as setJamTracksAction,
setRecordedTracks as setRecordedTracksAction,
setMetronome as setMetronomeAction,
setMediaSummary,
setSimulatedMusicCategoryMixers as setSimulatedMusicAction,
setSimulatedChatCategoryMixers as setSimulatedChatAction
} from '../store/features/mixersSlice';
import {
selectBackingTracks,
selectJamTracks,
selectRecordedTracks
} from '../store/features/mediaSlice';
import {
selectMixMode,
selectCurrentMixerRange,
setCurrentMixerRange
} from '../store/features/sessionUISlice';
import { ChannelGroupIds, CategoryGroupIds, MIX_MODES, MIDI_TRACK } from '../helpers/globals.js';
import useVuHelpers from './useVuHelpers.js';
import useFaderHelpers from './useFaderHelpers.js';
@ -13,41 +58,45 @@ import { getAvatarUrl, getInstrumentIcon45, getInstrumentIcon24 } from '../helpe
const useMixerHelper = () => {
const dispatch = useDispatch();
const allMixersRef = useRef({});
const [chatMixer, setChatMixer] = useState(null);
const [broadcastMixer, setBroadcastMixer] = useState(null);
const [recordingMixer, setRecordingMixer] = useState(null);
const [recordingTrackMixers, setRecordingTrackMixers] = useState([]);
const [backingTrackMixers, setBackingTrackMixers] = useState([]);
const [jamTrackMixers, setJamTrackMixers] = useState([]);
const [metronomeTrackMixers, setMetronomeTrackMixers] = useState([]);
const [adhocTrackMixers, setAdhocTrackMixers] = useState([]);
const [backingTracks, setBackingTracks] = useState([]);
const [jamTracks, setJamTracks] = useState([]);
const [recordedTracks, setRecordedTracks] = useState([]);
const [metronome, setMetronome] = useState(null);
const [mediaSummary, setMediaSummary] = useState({});
// Redux selectors - replace all useState calls
const chatMixer = useSelector(selectChatMixer);
const broadcastMixer = useSelector(selectBroadcastMixer);
const recordingMixer = useSelector(selectRecordingMixer);
const recordingTrackMixers = useSelector(selectRecordingTrackMixers);
const backingTrackMixers = useSelector(selectBackingTrackMixers);
const jamTrackMixers = useSelector(selectJamTrackMixers);
const metronomeTrackMixers = useSelector(selectMetronomeTrackMixers);
const adhocTrackMixers = useSelector(selectAdhocTrackMixers);
const masterMixers = useSelector(selectMasterMixers);
const personalMixers = useSelector(selectPersonalMixers);
const allMixers = useSelector(selectAllMixers);
const mixersByResourceId = useSelector(selectMixersByResourceId);
const mixersByTrackId = useSelector(selectMixersByTrackId);
const metronome = useSelector(selectMetronome);
const metronomeSettings = useSelector(selectMetronomeSettings);
const mediaSummary = useSelector(selectMediaSummary);
const noAudioUsers = useSelector(selectNoAudioUsers);
const clientsWithAudioOverride = useSelector(selectClientsWithAudioOverride);
const simulatedMusicCategoryMixers = useSelector(selectSimulatedMusicCategoryMixers);
const simulatedChatCategoryMixers = useSelector(selectSimulatedChatCategoryMixers);
const isReadyRedux = useSelector(selectMixersReady);
// Media data from mediaSlice
const backingTracks = useSelector(selectBackingTracks);
const jamTracks = useSelector(selectJamTracks);
const recordedTracks = useSelector(selectRecordedTracks);
// UI state from sessionUISlice
const mixMode = useSelector(selectMixMode);
const currentMixerRange = useSelector(selectCurrentMixerRange);
const [session, setSession] = useState(null);
const [masterMixers, setMasterMixers] = useState([]);
const [personalMixers, setPersonalMixers] = useState([]);
const [metro, setMetro] = useState(false);
const [noAudioUsers, setNoAudioUsers] = useState([]);
const [clientsWithAudioOverride, setClientsWithAudioOverride] = useState([]);
const [mixMode, setMixMode] = useState(MIX_MODES.PERSONAL);
const [mixersByResourceId, setMixersByResourceId] = useState({});
const [mixersByTrackId, setMixersByTrackId] = useState({});
const [allMixers, setAllMixers] = useState({});
const [currentMixerRangeMin, setCurrentMixerRangeMin] = useState(null);
const [currentMixerRangeMax, setCurrentMixerRangeMax] = useState(null);
const mediaTrackGroups = useMemo(() => [ChannelGroupIds.MediaTrackGroup, ChannelGroupIds.JamTrackGroup,
ChannelGroupIds.MetronomeGroup], []);
const muteBothMasterAndPersonalGroups = useMemo(() => [ChannelGroupIds.AudioInputMusicGroup, ChannelGroupIds.MidiInputMusicGroup, ChannelGroupIds.MediaTrackGroup,
ChannelGroupIds.JamTrackGroup, ChannelGroupIds.MetronomeGroup], []);
const [simulatedMusicCategoryMixers, setSimulatedMusicCategoryMixers] = useState({});
const [simulatedChatCategoryMixers, setSimulatedChatCategoryMixers] = useState({});
// Phase 4: Replace CurrentSessionContext with Redux
const currentSession = useSelector(selectActiveSession);
@ -70,7 +119,10 @@ const useMixerHelper = () => {
}, [allMixers]);
const getMixer = (mixerId, mode) => {
mode = mode || mixMode;
// Only default to mixMode if mode is undefined, not if it's explicitly false
if (mode === undefined) {
mode = mixMode;
}
return allMixersRef.current[(mode ? 'M' : 'P') + mixerId];
}
@ -101,10 +153,14 @@ const useMixerHelper = () => {
loop: mixer.loop
});
setCurrentMixerRangeMin(mixer.range_low);
setCurrentMixerRangeMax(mixer.range_high);
// Redux: Update mixer range in sessionUISlice
dispatch(setCurrentMixerRange({
min: mixer.range_low,
max: mixer.range_high
}));
return mixer;
}, [getMixer]);
}, [getMixer, setTrackVolumeObject, dispatch]);
const setMixerVolume = useCallback(async (mixer, volumePercent, relative, originalVolume, controlGroup) => {
const newVolume = faderHelpers.convertPercentToAudioTaper(volumePercent);
@ -142,7 +198,7 @@ const useMixerHelper = () => {
console.log("setMixerVolume: setting session mixer volume for mixer", mixer.id, "mode", mixer.mode, "volume", updatedTrackVolumeObject.volL);
await jamClient.SessionSetTrackVolumeData(mixer.id, mixer.mode, updatedTrackVolumeObject);
}
}, [trackVolumeObject, faderHelpers, jamClient]);
}, [trackVolumeObject, faderHelpers, jamClient, setTrackVolumeObject]);
const mediaMixers = useCallback((masterMixer, isOpener, currentAllMixers) => {
const personalMixer = isOpener ? getMixerByResourceId(masterMixer.rid, MIX_MODES.PERSONAL, currentAllMixers) : masterMixer;
@ -162,80 +218,32 @@ const useMixerHelper = () => {
};
}, []);
// Redux: updateMixerData now dispatches Redux actions
const updateMixerData = useCallback((session, masterMixers, personalMixers, metro, noAudioUsers, clientsWithAudioOverride, mixMode) => {
//console.debug("useMixerHelper: updateMixerData called", { session, masterMixers, personalMixers, mixMode });
setSession(session); //sessin is set to sessionHelper
setMasterMixers(masterMixers);
setPersonalMixers(personalMixers);
setMetro(metro);
setNoAudioUsers(noAudioUsers);
setClientsWithAudioOverride(clientsWithAudioOverride);
setMixMode(mixMode);
}, []);
// Dispatch Redux actions instead of setState calls
dispatch(setMasterMixers(masterMixers));
dispatch(setPersonalMixers(personalMixers));
// Note: session is tracked in activeSessionSlice, metro/noAudioUsers/clientsWithAudioOverride in mixersSlice
// mixMode is tracked in sessionUISlice
}, [dispatch]);
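updateMixerData now forwards the mixer arrays to Redux instead of holding them in hook state. A minimal plain-JS reducer sketch of the pattern those dispatches assume (the real mixersSlice is built with Redux Toolkit and its exact state shape may differ; action type strings here follow the usual `slice/action` convention):

```javascript
// Hypothetical stand-in for the mixersSlice actions dispatched above.
// The reducer contract is the same whether hand-written or via createSlice.
const initialState = { masterMixers: [], personalMixers: [] };

function mixersReducer(state = initialState, action) {
  switch (action.type) {
    case 'mixers/setMasterMixers':
      return { ...state, masterMixers: action.payload };
    case 'mixers/setPersonalMixers':
      return { ...state, personalMixers: action.payload };
    default:
      return state;
  }
}

const setMasterMixers = (mixers) => ({ type: 'mixers/setMasterMixers', payload: mixers });
const setPersonalMixers = (mixers) => ({ type: 'mixers/setPersonalMixers', payload: mixers });

let state = mixersReducer(undefined, { type: '@@init' });
state = mixersReducer(state, setMasterMixers([{ id: 1 }]));
state = mixersReducer(state, setPersonalMixers([{ id: 1 }, { id: 2 }]));
```

Moving these arrays into a slice is what lets other hooks (useMixerStore, the WebSocket handler) read the same mixer state without prop drilling or duplicated `useState`.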
// Redux: organizeMixers is now a reducer, trigger it via action
useEffect(() => {
if (!session || masterMixers.length === 0 || personalMixers.length === 0) return;
organizeMixers();
}, [session, masterMixers, personalMixers]);
if (!currentSession || masterMixers.length === 0 || personalMixers.length === 0) return;
dispatch(organizeMixers());
}, [currentSession, masterMixers, personalMixers, dispatch]);
const organizeMixers = () => {
const newAllMixers = {};
const newMixersByResourceId = {};
const newMixersByTrackId = {};
for (const masterMixer of masterMixers) {
newAllMixers['M' + masterMixer.id] = masterMixer;
const mixerPair = {};
newMixersByResourceId[masterMixer.rid] = mixerPair;
newMixersByTrackId[masterMixer.id] = mixerPair;
mixerPair.master = masterMixer;
}
for (const personalMixer of personalMixers) {
newAllMixers['P' + personalMixer.id] = personalMixer;
let mixerPair = newMixersByResourceId[personalMixer.rid];
if (!mixerPair) {
if (personalMixer.group_id !== ChannelGroupIds.MonitorGroup) {
logger.warn("there is no master version of ", personalMixer);
}
mixerPair = {};
newMixersByResourceId[personalMixer.rid] = mixerPair;
}
newMixersByTrackId[personalMixer.id] = mixerPair;
mixerPair.personal = personalMixer;
}
//console.log("useMixerHelper: setting all mixers", newAllMixers);
setAllMixers(prev => {
return { ...prev, ...newAllMixers };
});
setMixersByResourceId(prev => {
return { ...prev, ...newMixersByResourceId };
});
setMixersByTrackId(prev => {
return { ...prev, ...newMixersByTrackId };
});
const newChatMixer = resolveChatMixer(newMixersByResourceId, newAllMixers);
setChatMixer(newChatMixer);
}
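organizeMixers pairs master and personal mixers that share a resource id into one `{ master, personal }` object, indexed both by rid and by track id. The pairing step can be sketched in isolation (sample mixer objects here are illustrative):

```javascript
// Condensed version of the pairing logic in organizeMixers: master and
// personal mixers sharing a resource id (rid) are joined into one pair.
function pairMixers(masterMixers, personalMixers) {
  const byResourceId = {};
  for (const m of masterMixers) {
    byResourceId[m.rid] = { master: m };
  }
  for (const p of personalMixers) {
    // Create the pair if no master version exists (e.g. monitor-group mixers).
    (byResourceId[p.rid] = byResourceId[p.rid] || {}).personal = p;
  }
  return byResourceId;
}

const pairs = pairMixers(
  [{ id: 'M1', rid: 'r1' }],
  [{ id: 'P1', rid: 'r1' }, { id: 'P2', rid: 'r2' }] // r2 has no master version
);
```

The pair object is shared between both indexes, so updating `pair.personal` through one index is visible through the other.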
// Note: groupMixersByType logic is handled by WebSocket MIXER_CHANGES handler
// which updates backingTrackMixers, jamTrackMixers, etc. via Redux actions
// Sync local isReady ref with Redux isReady state
// This ensures VU meter callbacks have immediate access to the ready state
useEffect(() => {
if (!session || Object.keys(allMixers).length === 0) return;
groupTypes(allMixers);
}, [session, allMixers]);
useEffect(() => {
if (Object.keys(allMixers).length > 0 && !isReady.current) {
// console.log("useMixerHelper: isReady set to true");
isReady.current = true;
}
}, [allMixers, isReady]);
isReady.current = isReadyRedux;
// console.log("useMixerHelper: isReady synced with Redux", isReadyRedux);
}, [isReadyRedux]);
const getMixerByTrackId = useCallback((trackId, mode) => {
const mixerPair = mixersByTrackId[trackId];
@@ -544,467 +552,7 @@ const useMixerHelper = () => {
const setMixerPan = useCallback(async (mixer, panPercent) => {
trackVolumeObject.pan = panHelpers.convertPercentToPan(panPercent);
await jamClient.SessionSetTrackVolumeData(mixer.id, mixer.mode, trackVolumeObject);
}, [trackVolumeObject, panHelpers]);
const resolveBackingTracks = useCallback((currentBackingTrackMixers) => {
const backingTracks = [];
if (currentBackingTrackMixers.length === 0) return backingTracks;
let serverBackingTracks = [];
let backingTrackMixers = currentBackingTrackMixers;
if (session.isPlayingRecording()) {
// Include every mixer (managed or not) when playing a recording
backingTrackMixers = backingTrackMixers.slice();
serverBackingTracks = session.recordedBackingTracks();
} else {
serverBackingTracks = session.backingTracks();
backingTrackMixers = context._.filter(backingTrackMixers, (mixer) => !mixer.managed);
if (backingTrackMixers.length > 1) {
logger.error("multiple unmanaged backing track mixers encountered", backingTrackMixers);
console.warn("Multiple Backing Tracks Encountered: Only one backing track can be open at a time.");
return backingTracks;
}
}
if (!serverBackingTracks || serverBackingTracks.length === 0) {
return backingTracks;
}
let noCorrespondingTracks = false;
for (const mixer of backingTrackMixers) {
const correspondingTracks = [];
noCorrespondingTracks = false;
if (session.isPlayingRecording()) {
for (const backingTrack of serverBackingTracks) {
if (mixer.persisted_track_id === backingTrack.client_track_id || mixer.id === 'L' + backingTrack.client_track_id) {
correspondingTracks.push(backingTrack);
}
}
} else {
correspondingTracks.push(serverBackingTracks[0]);
}
if (correspondingTracks.length === 0) {
noCorrespondingTracks = true;
logger.debug("renderBackingTracks: could not map backing tracks");
console.warn("Unable to Open Backing Track: Could not correlate server and client tracks");
break;
}
const serverBackingTrack = correspondingTracks[0];
const oppositeMixer = getMixerByResourceId(mixer.rid, MIX_MODES.PERSONAL, allMixers);
const isOpener = mixer.group_id === ChannelGroupIds.MediaTrackGroup;
const data = {
isOpener: isOpener,
shortFilename: context.JK.getNameOfFile(serverBackingTrack.filename),
instrumentIcon: getInstrumentIcon45(serverBackingTrack.instrument_id),
photoUrl: "/assets/content/icon_recording.png",
showLoop: isOpener && !session.isPlayingRecording(),
track: serverBackingTrack,
mixers: mediaMixers(mixer, isOpener, allMixers)
};
backingTracks.push(data);
}
return backingTracks;
}, [session, allMixers]);
const resolveJamTracks = useCallback((currentJamTrackMixers) => {
const _jamTracks = [];
if (currentJamTrackMixers.length === 0) return _jamTracks;
let jamTrackMixers = currentJamTrackMixers.slice();
let jamTracks = [];
let jamTrackName = null;
const jamTrackMixdown = session?.jamTrackMixdown() || { id: null };
if (session?.isPlayingRecording()) {
jamTracks = session.recordedJamTracks();
jamTrackName = session.recordedJamTrackName();
} else {
jamTracks = session?.jamTracks();
jamTrackName = session?.jamTrackName();
}
const isOpener = jamTrackMixers[0]?.group_id === ChannelGroupIds.JamTrackGroup;
if (jamTracks) {
let noCorrespondingTracks = false;
if (jamTrackMixdown.id) {
logger.debug(`MixerHelper: mixdown is active. id: ${jamTrackMixdown.id}`);
if (jamTrackMixers.length === 0) {
noCorrespondingTracks = true;
logger.error("could not correlate mixdown tracks", jamTrackMixers, jamTrackMixdown);
// session?.app?.notify({
// title: "Unable to Open Custom Mix",
// text: "Could not correlate server and client tracks",
// icon_url: "/assets/content/icon_alert_big.png"
// });
return _jamTracks;
} else if (jamTrackMixers.length > 1) {
logger.warn("ignoring wrong amount of mixers for JamTrack in mixdown mode");
return _jamTracks;
} else {
const instrumentIcon = getInstrumentIcon24('other');
const part = null;
const instrumentName = 'Custom Mix';
const trackName = 'Custom Mix';
const data = {
name: jamTrackName,
trackName: trackName,
part: part,
isOpener: isOpener,
instrumentIcon: instrumentIcon,
track: jamTrackMixdown,
mixers: mediaMixers(jamTrackMixers[0], isOpener, allMixers)
};
_jamTracks.push(data);
}
} else {
logger.debug("MixerHelper: full jamtrack is active");
if (jamTrackMixers.length === 1) {
logger.warn("ignoring wrong amount of mixers for JamTrack in Full Track mode");
return _jamTracks;
}
for (const jamTrack of jamTracks) {
let mixer = null;
const correspondingTracks = [];
for (const matchMixer of currentJamTrackMixers) {
if (matchMixer.id === jamTrack.id) {
correspondingTracks.push(jamTrack);
mixer = matchMixer;
}
}
if (correspondingTracks.length === 0) {
noCorrespondingTracks = true;
logger.error("could not correlate jam tracks", jamTrackMixers, jamTracks);
// session?.app?.notify({
// title: "Unable to Open JamTrack",
// text: "Could not correlate server and client tracks",
// icon_url: "/assets/content/icon_alert_big.png"
// });
return _jamTracks;
}
jamTrackMixers.splice(jamTrackMixers.indexOf(mixer), 1);
const oneOfTheTracks = correspondingTracks[0];
const instrumentIcon = getInstrumentIcon24(oneOfTheTracks.instrument.id);
const part = oneOfTheTracks.part;
let instrumentName = oneOfTheTracks.instrument.description;
let trackName;
if (part) {
trackName = `${instrumentName}: ${part}`;
} else {
trackName = instrumentName;
}
if (jamTrack.track_type === 'Click') {
trackName = 'Clicktrack';
}
const data = {
name: jamTrackName,
trackName: trackName,
part: part,
isOpener: isOpener,
instrumentIcon: instrumentIcon,
track: oneOfTheTracks,
mixers: mediaMixers(mixer, isOpener, allMixers)
};
_jamTracks.push(data);
}
}
}
return _jamTracks;
}, [session, allMixers]);
const resolveRecordedTracks = useCallback((currentRecordingTrackMixers, currentAllMixers) => {
const recordedTracks = [];
if (currentRecordingTrackMixers.length === 0) return recordedTracks;
const serverRecordedTracks = session?.recordedTracks();
const isOpener = currentRecordingTrackMixers[0]?.group_id === ChannelGroupIds.MediaTrackGroup;
if (serverRecordedTracks) {
const recordingName = session?.recordingName();
let noCorrespondingTracks = false;
for (const mixer of currentRecordingTrackMixers) {
const correspondingTracks = [];
for (const recordedTrack of serverRecordedTracks) {
if (mixer.id.indexOf("L") === 0) {
if (mixer.id.substring(1) === recordedTrack.client_track_id) {
correspondingTracks.push(recordedTrack);
}
} else if (mixer.id.indexOf("C") === 0) {
if (mixer.id.substring(1) === recordedTrack.client_id) {
correspondingTracks.push(recordedTrack);
}
} else {
alert("Invalid state: the recorded track had neither persisted_track_id nor persisted_client_id");
}
}
if (correspondingTracks.length === 0) {
noCorrespondingTracks = true;
// session?.app?.notify({
// title: "Unable to Open Recording",
// text: "Could not correlate server and client tracks",
// icon_url: "/assets/content/icon_alert_big.png"
// });
return recordedTracks;
}
const oneOfTheTracks = correspondingTracks[0];
const instrumentIcon = getInstrumentIcon24(oneOfTheTracks.instrument_id);
let userName = oneOfTheTracks.user.name;
if (!userName) {
userName = oneOfTheTracks.user.first_name + ' ' + oneOfTheTracks.user.last_name;
}
const data = {
recordingName: recordingName,
isOpener: isOpener,
userName: userName,
instrumentIcon: instrumentIcon,
track: oneOfTheTracks,
mixers: mediaMixers(mixer, isOpener, currentAllMixers)
};
recordedTracks.push(data);
}
}
return recordedTracks;
}, [session]);
const resolveMetronome = useCallback((currentMetronomeTrackMixers, currentAllMixers) => {
if (currentMetronomeTrackMixers.length === 0) return null;
const mixer = currentMetronomeTrackMixers[0];
const instrumentIcon = "/assets/content/icon_metronome.png";
const metronome = {
instrumentIcon: instrumentIcon,
mixers: mediaMixers(mixer, true, currentAllMixers)
};
return metronome;
}, []);
const resolveChatMixer = useCallback((currentMixersByResourceId, currentAllMixers) => {
const masterChatMixers = mixersForGroupId(ChannelGroupIds.AudioInputChatGroup, MIX_MODES.MASTER, currentAllMixers);
if (masterChatMixers.length === 0) return null;
const personalChatMixers = mixersForGroupId(ChannelGroupIds.AudioInputChatGroup, MIX_MODES.PERSONAL, currentAllMixers);
if (personalChatMixers.length === 0) {
logger.warn("unable to find personal mixer for voice chat");
return null;
}
const masterChatMixer = masterChatMixers[0];
const personalChatMixer = personalChatMixers[0];
return {
master: {
mixer: masterChatMixer,
muteMixer: masterChatMixer,
vuMixer: masterChatMixer,
oppositeMixer: personalChatMixer
},
personal: {
mixer: personalChatMixer,
muteMixer: personalChatMixer,
vuMixer: personalChatMixer,
oppositeMixer: masterChatMixer
}
};
}, []);
const groupTypes = useCallback(() => {
// console.debug("useMixerHelper: groupTypes called", {session, allMixers});
if(!session || Object.keys(allMixers).length === 0) return;
const localMediaMixers = mixersForGroupIds(mediaTrackGroups, MIX_MODES.MASTER, allMixers);
const peerLocalMediaMixers = mixersForGroupId(ChannelGroupIds.PeerMediaTrackGroup, MIX_MODES.MASTER, allMixers);
const newRecordingTrackMixers = [];
const newBackingTrackMixers = [];
const newJamTrackMixers = [];
const newMetronomeTrackMixers = [];
const newAdhocTrackMixers = [];
const groupByType = (mixers, isLocalMixer) => {
for (const mixer of mixers) {
const mediaType = mixer.media_type;
const groupId = mixer.group_id;
if (mediaType === 'MetronomeTrack' || groupId === ChannelGroupIds.MetronomeGroup) {
newMetronomeTrackMixers.push(mixer);
} else if (mediaType === null || mediaType === "" || mediaType === 'RecordingTrack') {
let isJamTrack = false;
if (mixer.id === session.jamTrackMixdown()?.id) {
isJamTrack = true;
}
if (!isJamTrack && session.jamTracks()) {
for (const jamTrack of session.jamTracks()) {
if (mixer.id === jamTrack.id) {
isJamTrack = true;
break;
}
}
}
if (!isJamTrack && session.recordedJamTracks()) {
for (const recordedJamTrack of session.recordedJamTracks()) {
if (mixer.id === recordedJamTrack.id) {
isJamTrack = true;
break;
}
}
}
if (isJamTrack) {
newJamTrackMixers.push(mixer);
} else {
let isBackingTrack = false;
if (session.recordedBackingTracks()) {
for (const recordedBackingTrack of session.recordedBackingTracks()) {
if (mixer.id === 'L' + recordedBackingTrack.client_track_id) {
isBackingTrack = true;
break;
}
}
}
if (session.backingTracks()) {
for (const backingTrack of session.backingTracks()) {
if (mixer.id === 'L' + backingTrack.client_track_id) {
isBackingTrack = true;
break;
}
}
}
if (isBackingTrack) {
newBackingTrackMixers.push(mixer);
} else {
newRecordingTrackMixers.push(mixer);
}
}
} else if (mediaType === 'PeerMediaTrack' || mediaType === 'BackingTrack') {
newBackingTrackMixers.push(mixer);
} else if (mediaType === 'JamTrack') {
newJamTrackMixers.push(mixer);
} else {
if (mediaType !== 'Broadcast') {
logger.warn("Unknown track type: " + mediaType);
newAdhocTrackMixers.push(mixer);
}
}
}
};
groupByType(localMediaMixers, true);
groupByType(peerLocalMediaMixers, false);
setRecordingTrackMixers(newRecordingTrackMixers);
setBackingTrackMixers(newBackingTrackMixers);
setJamTrackMixers(newJamTrackMixers);
setMetronomeTrackMixers(newMetronomeTrackMixers);
setAdhocTrackMixers(newAdhocTrackMixers);
const newBackingTracks = resolveBackingTracks(newBackingTrackMixers, allMixers);
const newJamTracks = resolveJamTracks(newJamTrackMixers, allMixers);
const newRecordedTracks = resolveRecordedTracks(newRecordingTrackMixers, allMixers);
const newMetronome = resolveMetronome(newMetronomeTrackMixers, allMixers);
setBackingTracks(newBackingTracks);
setJamTracks(newJamTracks);
setRecordedTracks(newRecordedTracks);
setMetronome(newMetronome);
const newMediaSummary = {
recordingOpen: newRecordedTracks.length > 0,
jamTrackOpen: newJamTracks.length > 0,
backingTrackOpen: newBackingTracks.length > 0,
metronomeOpen: session.isMetronomeOpen()
};
let mediaOpenSummary = false;
for (const mediaType in newMediaSummary) {
if (newMediaSummary[mediaType]) {
mediaOpenSummary = true;
break;
}
}
newMediaSummary.mediaOpen = mediaOpenSummary;
newMediaSummary.userNeedsMediaControls = newMediaSummary.mediaOpen || window.JamTrackStore?.jamTrack;
newMediaSummary.jamTrack = window.JamTrackStore?.jamTrack;
let isOpener = false;
if (newMediaSummary.recordingOpen) {
isOpener = newRecordedTracks[0].isOpener;
} else if (newMediaSummary.jamTrackOpen) {
isOpener = newJamTracks[0].isOpener;
} else if (newMediaSummary.backingTrackOpen) {
isOpener = newBackingTracks[0].isOpener;
}
newMediaSummary.isOpener = isOpener;
setMediaSummary(newMediaSummary);
prepareSimulatedMixers(allMixers);
}, [session, allMixers]);
const mixersForGroupIds = useCallback((groupIds, mixMode, currentAllMixers) => {
const foundMixers = [];
const modePrefix = mixMode === MIX_MODES.MASTER ? 'M' : 'P';
for (const mixerKey in currentAllMixers) {
if (mixerKey.startsWith(modePrefix)) {
const mixer = currentAllMixers[mixerKey];
if (mixer && groupIds.includes(mixer.group_id)) {
foundMixers.push(mixer);
}
}
}
return foundMixers;
}, []);
}, [trackVolumeObject, panHelpers, jamClient]);
const getMixerByResourceId = useCallback((resourceId, mode) => {
const mixerPair = mixersByResourceId[resourceId];
@@ -1022,8 +570,6 @@ const useMixerHelper = () => {
}
}, [mixersByResourceId]);
const mute = useCallback(async (mixerId, mode, muting) => {
if (mode == null) { mode = mixMode; }
@@ -1089,16 +635,16 @@ const useMixerHelper = () => {
await setMixerVolume(mixer, data.percentage, relative, originalVolume, controlGroup, allMixers);
// Update local mixer state - create new state object to trigger React re-renders
setAllMixers(prev => ({
...prev,
[`${mixer.mode ? 'M' : 'P'}${mixer.id}`]: {
...prev[`${mixer.mode ? 'M' : 'P'}${mixer.id}`],
// Redux: Update local mixer state via dispatch
dispatch(updateMixer({
mixerId: mixer.id,
mode: mixer.mode,
updates: {
volume_left: trackVolumeObject.volL
}
}));
}
}, [getOriginalVolume, getMixer, fillTrackVolumeObject, setMixerVolume, allMixers]);
}, [getOriginalVolume, getMixer, fillTrackVolumeObject, setMixerVolume, allMixers, trackVolumeObject, dispatch]);
const initGain = useCallback((mixer) => {
if (Array.isArray(mixer)) {
@@ -1125,13 +671,13 @@ const useMixerHelper = () => {
result.push(trackVolumeObject.pan);
}
return result;
}, [getMixer, fillTrackVolumeObject, setMixerPan, allMixers]);
}, [getMixer, fillTrackVolumeObject, setMixerPan, allMixers, trackVolumeObject]);
const initPan = useCallback((mixer) => {
const panPercent = panHelpers.convertPanToPercent(mixer.pan);
faderHelpers.setFaderValue(mixer.id, panPercent, Math.abs(mixer.pan));
faderHelpers.showFader(mixer.id);
}, [faderHelpers]);
}, [faderHelpers, panHelpers]);
const loopChanged = useCallback(async (mixer, shouldLoop) => {
fillTrackVolumeObject(mixer.id, mixer.mode, allMixers);
@@ -1144,7 +690,7 @@ const useMixerHelper = () => {
const updatedMixer = getMixer(mixer.id, mixer.mode, allMixers);
updatedMixer.loop = context.trackVolumeObject.loop;
}, [getMixer, fillTrackVolumeObject, allMixers]);
}, [getMixer, fillTrackVolumeObject, allMixers, setTrackVolumeObject]);
const percentFromMixerValue = useCallback((min, max, value) => {
try {
@@ -1202,81 +748,26 @@ const useMixerHelper = () => {
// console.log("useMixerHelper: updateVU mixer", allMixersRef.current, mixerId, mode, mixer);
updateVU3(mixer, leftValue, leftClipping, rightValue, rightClipping);
}
}, []);
}, [getMixer, updateVU3]);
const getTrackInfo = useCallback(async () => {
return await context.JK.TrackHelpers.getTrackInfo(context.jamClient, masterMixers);
}, [masterMixers]);
const prepareSimulatedMixers = useCallback(() => {
const newSimulatedMusicCategoryMixers = {};
const newSimulatedChatCategoryMixers = {};
const mixersForGroupIds = useCallback((groupIds, mixMode, currentAllMixers) => {
const foundMixers = [];
const modePrefix = mixMode === MIX_MODES.MASTER ? 'M' : 'P';
newSimulatedMusicCategoryMixers[MIX_MODES.MASTER] = getSimulatedMusicCategoryMixer(MIX_MODES.MASTER, allMixers);
newSimulatedMusicCategoryMixers[MIX_MODES.PERSONAL] = getSimulatedMusicCategoryMixer(MIX_MODES.PERSONAL, allMixers);
// Call getSimulatedChatCategoryMixer once and assign parts like web project
const chatMixerData = getSimulatedChatCategoryMixer(allMixers);
newSimulatedChatCategoryMixers[MIX_MODES.MASTER] = chatMixerData?.master;
newSimulatedChatCategoryMixers[MIX_MODES.PERSONAL] = chatMixerData?.personal;
setSimulatedMusicCategoryMixers(newSimulatedMusicCategoryMixers);
setSimulatedChatCategoryMixers(newSimulatedChatCategoryMixers);
}, [allMixers]);
const getSimulatedMusicCategoryMixer = useCallback((mode, currentAllMixers) => {
const myInputs = getAudioInputCategoryMixer(mode, currentAllMixers)?.mixer;
const peerInputs = getUserMusicCategoryMixer(mode, currentAllMixers)?.mixer;
const myMedia = getMediaCategoryMixer(mode, currentAllMixers)?.mixer;
const peerMedia = getUserMediaCategoryMixer(mode, currentAllMixers)?.mixer;
const metronome = getMetronomeCategoryMixer(mode, currentAllMixers)?.mixer;
const output = getOutputMixer(mode, currentAllMixers);
const oppositeOutput = getOutputMixer(!mode, currentAllMixers);
if (myInputs) {
return {
first: myInputs,
mixer: [myInputs, peerInputs, myMedia, peerMedia, metronome].filter(m => m),
muteMixer: [myInputs, peerInputs, myMedia, peerMedia, metronome].filter(m => m),
vuMixer: output
};
} else {
return null;
}
}, [getAudioInputCategoryMixer, getUserMusicCategoryMixer, getMediaCategoryMixer, getUserMediaCategoryMixer, getMetronomeCategoryMixer, getOutputMixer]);
const getSimulatedChatCategoryMixer = useCallback((currentAllMixers) => {
// Find chat mixers by group_id like the web project does
const masterChatMixers = mixersForGroupId(ChannelGroupIds.AudioInputChatGroup, MIX_MODES.MASTER, currentAllMixers);
const personalChatMixers = mixersForGroupId(ChannelGroupIds.AudioInputChatGroup, MIX_MODES.PERSONAL, currentAllMixers);
if (masterChatMixers.length === 0 || personalChatMixers.length === 0) {
return null;
}
const masterChatMixer = masterChatMixers[0];
const personalChatMixer = personalChatMixers[0];
console.debug("useMixerHelper: getSimulatedChatCategoryMixer", masterChatMixer, personalChatMixer);
// Return the same structure as web project's resolveChatMixer
return {
master: {
mixer: masterChatMixer,
muteMixer: masterChatMixer,
vuMixer: masterChatMixer,
oppositeMixer: personalChatMixer
},
personal: {
mixer: personalChatMixer,
muteMixer: personalChatMixer,
vuMixer: personalChatMixer,
oppositeMixer: masterChatMixer
for (const mixerKey in currentAllMixers) {
if (mixerKey.startsWith(modePrefix)) {
const mixer = currentAllMixers[mixerKey];
if (mixer && groupIds.includes(mixer.group_id)) {
foundMixers.push(mixer);
}
}
};
}
return foundMixers;
}, []);
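mixersForGroupIds walks the flat mixer map and keeps only mixers of one mode whose group belongs to the requested set. A standalone sketch of the same scan (here the mode prefix is passed directly instead of being derived from `MIX_MODES`, purely for brevity):

```javascript
// Standalone version of mixersForGroupIds: select mixers of one mix mode
// (map keys prefixed 'M' or 'P') whose group_id is in the requested set.
function mixersForGroupIds(groupIds, modePrefix, allMixers) {
  const found = [];
  for (const key in allMixers) {
    if (key.startsWith(modePrefix) && groupIds.includes(allMixers[key].group_id)) {
      found.push(allMixers[key]);
    }
  }
  return found;
}

const all = {
  M1: { id: 1, group_id: 'media' },
  M2: { id: 2, group_id: 'metronome' },
  P1: { id: 1, group_id: 'media' },
};

mixersForGroupIds(['media'], 'M', all); // → only the master media mixer (id 1)
```

Filtering on the key prefix first means a single pass over the map serves both master and personal queries without maintaining separate collections.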
const refreshMixer = useCallback((mixers, currentAllMixers) => {
@@ -1329,17 +820,17 @@ const useMixerHelper = () => {
}, [getMixer]);
const recordingName = useCallback(() => {
return session.recordingName();
}, [session]);
return currentSession.recordingName();
}, [currentSession]);
const jamTrackName = useCallback(() => {
return session.jamTrackName();
}, [session]);
return currentSession.jamTrackName();
}, [currentSession]);
return {
isReady,
session,
session: currentSession, // Return currentSession from Redux instead of local state
// masterMixers,
// personalMixers,
mixers: allMixers,

File diff suppressed because it is too large


@@ -1,6 +1,11 @@
import { useState, useCallback, useEffect, useRef } from 'react';
import { useSelector } from 'react-redux';
import { selectInSession } from '../store/features/activeSessionSlice';
import {
selectMetronomeSettings,
selectNoAudioUsers,
selectClientsWithAudioOverride
} from '../store/features/mixersSlice';
import { throttle } from 'lodash'; // Add lodash for throttling
//import { useJamClient } from '../context/JamClientContext';
import { MIX_MODES } from '../helpers/globals';
@@ -19,6 +24,12 @@ export default function useMixerStore() {
//const jamClient = useJamClient();
// Phase 4: Replace CurrentSessionContext with Redux
const inSession = useSelector(selectInSession);
// Phase 5: Replace duplicate state with Redux selectors
const metro = useSelector(selectMetronomeSettings);
const noAudioUsers = useSelector(selectNoAudioUsers);
const clientsWithAudioOverride = useSelector(selectClientsWithAudioOverride);
const {
jamClient,
} = useJamServerContext();
@@ -27,13 +38,9 @@ export default function useMixerStore() {
const logger = console; // Replace with your logging mechanism if needed
//const { updateVU } = useVuHelpers();
// State management
const [metro, setMetro] = useState({ tempo: 120, cricket: false, sound: "Beep" });
const [noAudioUsers, setNoAudioUsers] = useState({});
// Local state (not duplicated in Redux)
const [checkingMissingPeers, setCheckingMissingPeers] = useState({});
const [missingMixerPeers, setMissingMixerPeers] = useState({});
const [clientsWithAudioOverride, setClientsWithAudioOverride] = useState({});
const [vuMeterUpdatePrefMap, setVuMeterUpdatePrefMap] = useState({ count: 0 });
const [session, setSession] = useState(null);
//const [masterMixers, setMasterMixers] = useState(null);
@@ -178,8 +185,8 @@ export default function useMixerStore() {
// Session management
const sessionEnded = useCallback(() => {
// Phase 5: Local state only - Redux handles noAudioUsers cleanup
setCheckingMissingPeers({});
setNoAudioUsers({});
setMissingMixerPeers({});
if (recheckTimeoutRef.current) {
clearTimeout(recheckTimeoutRef.current);


@@ -17,6 +17,16 @@ import {
addRecordedTrack,
setConnectionStatus
} from '../store/features/activeSessionSlice';
import {
updateMediaSummary,
setMetronome
} from '../store/features/mixersSlice';
import {
setBackingTracks,
setJamTracks,
setRecordedTracks,
updateJamTrackState
} from '../store/features/mediaSlice';
/**
* Custom hook to integrate WebSocket messages with Redux state
@@ -106,6 +116,29 @@ export const useSessionWebSocket = (sessionId) => {
dispatch(addRecordedTrack(data.track));
},
// Phase 5: Mixer and Media events
MIXER_CHANGES: (sessionMixers) => {
console.log('Mixer changes received:', sessionMixers);
const session = sessionMixers.session;
const mixers = sessionMixers.mixers;
// Update media summary
if (mixers.mediaSummary) {
dispatch(updateMediaSummary(mixers.mediaSummary));
}
// Update media arrays
dispatch(setBackingTracks(mixers.backingTracks || []));
dispatch(setJamTracks(mixers.jamTracks || []));
dispatch(setRecordedTracks(mixers.recordedTracks || []));
dispatch(setMetronome(mixers.metronome || null));
},
JAM_TRACK_CHANGES: (changes) => {
console.log('Jam track changes received:', changes);
dispatch(updateJamTrackState(changes));
},
// Connection events
connectionStatusChanged: (data) => {
console.log('Connection status changed:', data);


@@ -11,7 +11,6 @@ import { JamServerProvider } from '../context/JamServerContext';
import { MixersProvider } from '../context/MixersContext';
import { VuProvider } from '../context/VuContext';
import { GlobalProvider } from '../context/GlobalContext';
import { MediaProvider } from '../context/MediaContext';
const JKClientLayout = ({ location }) => {
@@ -27,9 +26,7 @@ const JKClientLayout = ({ location }) => {
<JamServerProvider>
<VuProvider>
<MixersProvider>
<MediaProvider>
<ClientRoutes />
</MediaProvider>
<ClientRoutes />
</MixersProvider>
</VuProvider>
</JamServerProvider>


@@ -0,0 +1,433 @@
import mediaReducer, {
openBackingTrack,
loadJamTrack,
closeMedia,
setBackingTracks,
setJamTracks,
setRecordedTracks,
updateJamTrackState,
clearJamTrackState,
clearAllMedia,
selectBackingTracks,
selectJamTracks,
selectRecordedTracks,
selectJamTrackState,
selectDownloadingJamTrack,
selectMediaLoading,
selectMediaError
} from '../mediaSlice';
describe('mediaSlice', () => {
const initialState = {
backingTracks: [],
jamTracks: [],
recordedTracks: [],
jamTrackState: {},
downloadingJamTrack: false,
loading: {
backingTrack: false,
jamTrack: false,
closing: false
},
error: null
};
afterEach(() => {
jest.clearAllMocks();
});
describe('reducer', () => {
it('should return the initial state', () => {
expect(mediaReducer(undefined, { type: 'unknown' })).toEqual(initialState);
});
});
describe('synchronous actions', () => {
it('should handle setBackingTracks', () => {
const tracks = [
{ id: 1, name: 'Track 1', isOpener: true },
{ id: 2, name: 'Track 2', isOpener: false }
];
const actual = mediaReducer(initialState, setBackingTracks(tracks));
expect(actual.backingTracks).toEqual(tracks);
expect(actual.backingTracks).toHaveLength(2);
});
it('should handle setJamTracks', () => {
const tracks = [
{ id: 1, name: 'Jam Track 1', part: 'drums' },
{ id: 2, name: 'Jam Track 2', part: 'bass' }
];
const actual = mediaReducer(initialState, setJamTracks(tracks));
expect(actual.jamTracks).toEqual(tracks);
});
it('should handle setRecordedTracks', () => {
const tracks = [
{ id: 1, recordingName: 'Recording 1' }
];
const actual = mediaReducer(initialState, setRecordedTracks(tracks));
expect(actual.recordedTracks).toEqual(tracks);
});
it('should handle updateJamTrackState', () => {
const actual = mediaReducer(
initialState,
updateJamTrackState({ playing: true, position: 1.5 })
);
expect(actual.jamTrackState.playing).toBe(true);
expect(actual.jamTrackState.position).toBe(1.5);
});
it('should merge jam track state updates', () => {
const stateWithJamTrack = {
...initialState,
jamTrackState: { playing: true, position: 1.5 }
};
const actual = mediaReducer(
stateWithJamTrack,
updateJamTrackState({ position: 2.0, volume: 0.8 })
);
expect(actual.jamTrackState.playing).toBe(true); // Preserved
expect(actual.jamTrackState.position).toBe(2.0); // Updated
expect(actual.jamTrackState.volume).toBe(0.8); // Added
});
it('should handle clearJamTrackState', () => {
const stateWithJamTrack = {
...initialState,
jamTrackState: { playing: true, position: 1.5 }
};
const actual = mediaReducer(stateWithJamTrack, clearJamTrackState());
expect(actual.jamTrackState).toEqual({});
});
it('should handle clearAllMedia', () => {
const stateWithMedia = {
...initialState,
backingTracks: [{ id: 1 }],
jamTracks: [{ id: 2 }],
recordedTracks: [{ id: 3 }],
jamTrackState: { playing: true },
downloadingJamTrack: true,
error: 'Some error'
};
const actual = mediaReducer(stateWithMedia, clearAllMedia());
expect(actual.backingTracks).toEqual([]);
expect(actual.jamTracks).toEqual([]);
expect(actual.recordedTracks).toEqual([]);
expect(actual.jamTrackState).toEqual({});
expect(actual.downloadingJamTrack).toBe(false);
expect(actual.error).toBeNull();
});
});
describe('async thunks: openBackingTrack', () => {
it('should handle openBackingTrack.pending', () => {
const action = { type: openBackingTrack.pending.type };
const actual = mediaReducer(initialState, action);
expect(actual.loading.backingTrack).toBe(true);
expect(actual.error).toBeNull();
});
it('should handle openBackingTrack.fulfilled', () => {
const stateWithLoading = {
...initialState,
loading: { ...initialState.loading, backingTrack: true }
};
const action = {
type: openBackingTrack.fulfilled.type,
payload: { file: 'track.mp3' }
};
const actual = mediaReducer(stateWithLoading, action);
expect(actual.loading.backingTrack).toBe(false);
});
it('should handle openBackingTrack.rejected', () => {
const stateWithLoading = {
...initialState,
loading: { ...initialState.loading, backingTrack: true }
};
const action = {
type: openBackingTrack.rejected.type,
payload: 'Failed to open backing track'
};
const actual = mediaReducer(stateWithLoading, action);
expect(actual.loading.backingTrack).toBe(false);
expect(actual.error).toBe('Failed to open backing track');
});
it('should call jamClient.SessionOpenBackingTrackFile', async () => {
const mockJamClient = {
SessionOpenBackingTrackFile: jest.fn().mockResolvedValue()
};
const dispatch = jest.fn();
const thunk = openBackingTrack({
file: 'test.mp3',
jamClient: mockJamClient
});
await thunk(dispatch, () => ({}), undefined);
expect(mockJamClient.SessionOpenBackingTrackFile).toHaveBeenCalledWith('test.mp3', false);
});
});
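The tests above call the thunk with the `(dispatch, getState, extra)` signature that Redux Toolkit's createAsyncThunk produces. A hand-rolled approximation of openBackingTrack showing the pending/fulfilled/rejected flow being asserted on (this is a sketch, not the actual mediaSlice implementation; the action type strings mirror RTK's `slice/name/<phase>` convention):

```javascript
// Hand-rolled approximation of the openBackingTrack async thunk.
// The real implementation uses createAsyncThunk; the dispatched action
// sequence (pending, then fulfilled or rejected) is the same.
function openBackingTrack({ file, jamClient, loop = false }) {
  return async (dispatch) => {
    dispatch({ type: 'media/openBackingTrack/pending' });
    try {
      // Matches the call asserted in the test: (file, loop) with loop false by default.
      await jamClient.SessionOpenBackingTrackFile(file, loop);
      const payload = { file };
      dispatch({ type: 'media/openBackingTrack/fulfilled', payload });
      return payload;
    } catch (err) {
      dispatch({ type: 'media/openBackingTrack/rejected', payload: String(err) });
      return null;
    }
  };
}
```

Encoding the loading flag transitions in the reducer (rather than in components) is what keeps `loading.backingTrack` and `error` consistent no matter where the thunk is dispatched from.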
describe('async thunks: loadJamTrack', () => {
it('should handle loadJamTrack.pending', () => {
const action = { type: loadJamTrack.pending.type };
const actual = mediaReducer(initialState, action);
expect(actual.loading.jamTrack).toBe(true);
expect(actual.downloadingJamTrack).toBe(true);
expect(actual.error).toBeNull();
});
it('should handle loadJamTrack.fulfilled', () => {
const stateWithLoading = {
...initialState,
loading: { ...initialState.loading, jamTrack: true },
downloadingJamTrack: true
};
const action = {
type: loadJamTrack.fulfilled.type,
payload: { jamTrack: { id: 123 }, result: true }
};
const actual = mediaReducer(stateWithLoading, action);
expect(actual.loading.jamTrack).toBe(false);
expect(actual.downloadingJamTrack).toBe(false);
});
it('should handle loadJamTrack.rejected', () => {
const stateWithLoading = {
...initialState,
loading: { ...initialState.loading, jamTrack: true },
downloadingJamTrack: true
};
const action = {
type: loadJamTrack.rejected.type,
payload: 'Unable to open JamTrack'
};
const actual = mediaReducer(stateWithLoading, action);
expect(actual.loading.jamTrack).toBe(false);
expect(actual.downloadingJamTrack).toBe(false);
expect(actual.error).toBe('Unable to open JamTrack');
});
it('should call jamClient.JamTrackPlay without JMep', async () => {
const mockJamClient = {
JamTrackPlay: jest.fn().mockResolvedValue(true)
};
const dispatch = jest.fn();
const thunk = loadJamTrack({
jamTrack: { id: 123, name: 'Test Track' },
jamClient: mockJamClient
});
await thunk(dispatch, () => ({}), undefined);
expect(mockJamClient.JamTrackPlay).toHaveBeenCalledWith(123);
});
it('should call jamClient.JamTrackLoadJmep when JMep data available', async () => {
const mockJamClient = {
GetSampleRate: jest.fn().mockResolvedValue(48),
JamTrackLoadJmep: jest.fn().mockResolvedValue(),
JamTrackPlay: jest.fn().mockResolvedValue(true)
};
const dispatch = jest.fn();
const thunk = loadJamTrack({
jamTrack: { id: 123, name: 'Test Track', jmep: 'base64data...' },
jamClient: mockJamClient
});
await thunk(dispatch, () => ({}), undefined);
expect(mockJamClient.GetSampleRate).toHaveBeenCalled();
expect(mockJamClient.JamTrackLoadJmep).toHaveBeenCalledWith('123-48', 'base64data...');
expect(mockJamClient.JamTrackPlay).toHaveBeenCalledWith(123);
});
it('should use 44 sample rate filename when sample rate is not 48', async () => {
const mockJamClient = {
GetSampleRate: jest.fn().mockResolvedValue(44.1),
JamTrackLoadJmep: jest.fn().mockResolvedValue(),
JamTrackPlay: jest.fn().mockResolvedValue(true)
};
const dispatch = jest.fn();
const thunk = loadJamTrack({
jamTrack: { id: 456, jmep: 'base64data...' },
jamClient: mockJamClient
});
await thunk(dispatch, () => ({}), undefined);
expect(mockJamClient.JamTrackLoadJmep).toHaveBeenCalledWith('456-44', 'base64data...');
});
it('should reject when JamTrackPlay returns false', async () => {
const mockJamClient = {
JamTrackPlay: jest.fn().mockResolvedValue(false)
};
const dispatch = jest.fn();
const getState = jest.fn();
const thunk = loadJamTrack({
jamTrack: { id: 123 },
jamClient: mockJamClient
});
const result = await thunk(dispatch, getState, undefined);
expect(result.type).toBe('media/loadJamTrack/rejected');
expect(result.payload).toBe('Unable to open JamTrack');
});
});
describe('async thunks: closeMedia', () => {
it('should handle closeMedia.pending', () => {
const action = { type: closeMedia.pending.type };
const actual = mediaReducer(initialState, action);
expect(actual.loading.closing).toBe(true);
});
it('should handle closeMedia.fulfilled and clear all media', () => {
const stateWithMedia = {
...initialState,
backingTracks: [{ id: 1 }],
jamTracks: [{ id: 2 }],
recordedTracks: [{ id: 3 }],
jamTrackState: { playing: true },
loading: { ...initialState.loading, closing: true }
};
const action = { type: closeMedia.fulfilled.type };
const actual = mediaReducer(stateWithMedia, action);
expect(actual.loading.closing).toBe(false);
expect(actual.backingTracks).toEqual([]);
expect(actual.jamTracks).toEqual([]);
expect(actual.recordedTracks).toEqual([]);
expect(actual.jamTrackState).toEqual({});
});
it('should handle closeMedia.rejected', () => {
const stateWithLoading = {
...initialState,
loading: { ...initialState.loading, closing: true }
};
const action = {
type: closeMedia.rejected.type,
payload: 'Failed to close media'
};
const actual = mediaReducer(stateWithLoading, action);
expect(actual.loading.closing).toBe(false);
expect(actual.error).toBe('Failed to close media');
});
it('should call jamClient.SessionCloseMedia with force parameter', async () => {
const mockJamClient = {
SessionCloseMedia: jest.fn().mockResolvedValue()
};
const dispatch = jest.fn();
const thunk = closeMedia({
force: true,
jamClient: mockJamClient
});
await thunk(dispatch, () => ({}), undefined);
expect(mockJamClient.SessionCloseMedia).toHaveBeenCalledWith(true);
});
it('should default force to false', async () => {
const mockJamClient = {
SessionCloseMedia: jest.fn().mockResolvedValue()
};
const dispatch = jest.fn();
const thunk = closeMedia({
jamClient: mockJamClient
});
await thunk(dispatch, () => ({}), undefined);
expect(mockJamClient.SessionCloseMedia).toHaveBeenCalledWith(false);
});
});
describe('selectors', () => {
const mockState = {
media: {
backingTracks: [{ id: 1, name: 'Backing 1' }],
jamTracks: [{ id: 2, name: 'Jam 1' }],
recordedTracks: [{ id: 3, name: 'Recording 1' }],
jamTrackState: { playing: true, position: 2.5 },
downloadingJamTrack: true,
loading: {
backingTrack: false,
jamTrack: true,
closing: false
},
error: 'Test error'
}
};
it('should select backing tracks', () => {
expect(selectBackingTracks(mockState)).toEqual([{ id: 1, name: 'Backing 1' }]);
});
it('should select jam tracks', () => {
expect(selectJamTracks(mockState)).toEqual([{ id: 2, name: 'Jam 1' }]);
});
it('should select recorded tracks', () => {
expect(selectRecordedTracks(mockState)).toEqual([{ id: 3, name: 'Recording 1' }]);
});
it('should select jam track state', () => {
expect(selectJamTrackState(mockState)).toEqual({ playing: true, position: 2.5 });
});
it('should select downloading jam track flag', () => {
expect(selectDownloadingJamTrack(mockState)).toBe(true);
});
it('should select media loading states', () => {
expect(selectMediaLoading(mockState)).toEqual({
backingTrack: false,
jamTrack: true,
closing: false
});
});
it('should select media error', () => {
expect(selectMediaError(mockState)).toBe('Test error');
});
});
});
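The suite above leans on one design choice worth calling out: the native `jamClient` is injected through the thunk argument rather than imported at module scope, which is what lets every test substitute a plain mock object. A standalone sketch of that contract (an illustrative imitation, not the real `createAsyncThunk` wiring):

```javascript
// Illustration only: a hand-rolled stand-in for the openBackingTrack thunk,
// showing the dependency-injection contract the tests exercise. The real
// implementation uses createAsyncThunk; only the action type strings match it.
async function openBackingTrackThunk({ file, jamClient }) {
  try {
    // Second argument mirrors the slice's hard-coded `loop = false`.
    await jamClient.SessionOpenBackingTrackFile(file, false);
    return { type: 'media/openBackingTrack/fulfilled', payload: { file } };
  } catch (error) {
    return { type: 'media/openBackingTrack/rejected', payload: error.message };
  }
}

// Any object with the right method shape works as a jamClient in tests:
const fakeClient = {
  SessionOpenBackingTrackFile: async () => {}
};
openBackingTrackThunk({ file: 'track.mp3', jamClient: fakeClient });
```

Because the dependency arrives through the payload, the tests never need `jest.mock` on a module path; a literal object with the expected methods is enough.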


@@ -0,0 +1,533 @@
import mixersReducer, {
setMasterMixers,
setPersonalMixers,
organizeMixers,
setRecordingTrackMixers,
setBackingTrackMixers,
setJamTrackMixers,
setMetronomeTrackMixers,
setAdhocTrackMixers,
updateMixer,
setMetronome,
setMetronomeSettings,
updateMediaSummary,
setMediaSummary,
setChatMixer,
setBroadcastMixer,
setRecordingMixer,
setSimulatedMusicCategoryMixers,
setSimulatedChatCategoryMixers,
addNoAudioUser,
removeNoAudioUser,
setNoAudioUsers,
setClientsWithAudioOverride,
setMissingMixerPeers,
setCheckingMissingPeers,
clearMixers,
selectAllMixers,
selectMasterMixers,
selectPersonalMixers,
selectMixersByResourceId,
selectMixersByTrackId,
selectChatMixer,
selectBroadcastMixer,
selectRecordingMixer,
selectMetronome,
selectMetronomeSettings,
selectMediaSummary,
selectMixersReady,
selectBackingTrackMixers,
selectJamTrackMixers,
selectRecordingTrackMixers,
selectMetronomeTrackMixers,
selectAdhocTrackMixers,
selectSimulatedMusicCategoryMixers,
selectSimulatedChatCategoryMixers,
selectNoAudioUsers,
selectClientsWithAudioOverride,
selectMissingMixerPeers,
selectCheckingMissingPeers,
selectMixer,
selectMixerPairByResourceId,
selectMixerPairByTrackId
} from '../mixersSlice';
describe('mixersSlice', () => {
const initialState = {
chatMixer: null,
broadcastMixer: null,
recordingMixer: null,
recordingTrackMixers: [],
backingTrackMixers: [],
jamTrackMixers: [],
metronomeTrackMixers: [],
adhocTrackMixers: [],
masterMixers: [],
personalMixers: [],
allMixers: {},
mixersByResourceId: {},
mixersByTrackId: {},
simulatedMusicCategoryMixers: {
PERSONAL: null,
MASTER: null
},
simulatedChatCategoryMixers: {
PERSONAL: null,
MASTER: null
},
metronome: null,
metronomeSettings: {
tempo: 120,
sound: "Beep",
cricket: false
},
mediaSummary: {
mediaOpen: false,
backingTrackOpen: false,
jamTrackOpen: false,
recordingOpen: false,
metronomeOpen: false,
isOpener: false,
userNeedsMediaControls: false,
jamTrack: null
},
noAudioUsers: {},
clientsWithAudioOverride: {},
missingMixerPeers: {},
checkingMissingPeers: {},
isReady: false,
loading: false,
error: null
};
afterEach(() => {
jest.clearAllMocks();
});
describe('reducer', () => {
it('should return the initial state', () => {
expect(mixersReducer(undefined, { type: 'unknown' })).toEqual(initialState);
});
});
describe('master and personal mixers', () => {
it('should handle setMasterMixers', () => {
const masterMixers = [
{ id: 1, rid: 'rid1', name: 'Master Mixer 1' },
{ id: 2, rid: 'rid2', name: 'Master Mixer 2' }
];
const actual = mixersReducer(initialState, setMasterMixers(masterMixers));
expect(actual.masterMixers).toEqual(masterMixers);
expect(actual.masterMixers).toHaveLength(2);
});
it('should handle setPersonalMixers', () => {
const personalMixers = [
{ id: 1, rid: 'rid1', name: 'Personal Mixer 1' },
{ id: 2, rid: 'rid2', name: 'Personal Mixer 2' }
];
const actual = mixersReducer(initialState, setPersonalMixers(personalMixers));
expect(actual.personalMixers).toEqual(personalMixers);
expect(actual.personalMixers).toHaveLength(2);
});
});
describe('organizeMixers', () => {
it('should organize mixers and build lookup tables', () => {
const stateWithMixers = {
...initialState,
masterMixers: [
{ id: 1, rid: 'rid1', name: 'Master 1' },
{ id: 2, rid: 'rid2', name: 'Master 2' }
],
personalMixers: [
{ id: 1, rid: 'rid1', name: 'Personal 1' },
{ id: 2, rid: 'rid2', name: 'Personal 2' }
]
};
const actual = mixersReducer(stateWithMixers, organizeMixers());
// Check allMixers
expect(actual.allMixers['M1']).toEqual({ id: 1, rid: 'rid1', name: 'Master 1' });
expect(actual.allMixers['M2']).toEqual({ id: 2, rid: 'rid2', name: 'Master 2' });
expect(actual.allMixers['P1']).toEqual({ id: 1, rid: 'rid1', name: 'Personal 1' });
expect(actual.allMixers['P2']).toEqual({ id: 2, rid: 'rid2', name: 'Personal 2' });
// Check mixersByResourceId
expect(actual.mixersByResourceId['rid1'].master).toEqual({ id: 1, rid: 'rid1', name: 'Master 1' });
expect(actual.mixersByResourceId['rid1'].personal).toEqual({ id: 1, rid: 'rid1', name: 'Personal 1' });
expect(actual.mixersByResourceId['rid2'].master).toEqual({ id: 2, rid: 'rid2', name: 'Master 2' });
expect(actual.mixersByResourceId['rid2'].personal).toEqual({ id: 2, rid: 'rid2', name: 'Personal 2' });
// Check mixersByTrackId
expect(actual.mixersByTrackId[1].master).toEqual({ id: 1, rid: 'rid1', name: 'Master 1' });
expect(actual.mixersByTrackId[1].personal).toEqual({ id: 1, rid: 'rid1', name: 'Personal 1' });
// Check isReady flag
expect(actual.isReady).toBe(true);
});
it('should handle personal mixer without matching master', () => {
const stateWithMixers = {
...initialState,
masterMixers: [
{ id: 1, rid: 'rid1', name: 'Master 1' }
],
personalMixers: [
{ id: 1, rid: 'rid1', name: 'Personal 1' },
{ id: 2, rid: 'rid2', name: 'Personal 2 (no master)' }
]
};
const actual = mixersReducer(stateWithMixers, organizeMixers());
// Personal mixer without master should still be added
expect(actual.allMixers['P2']).toEqual({ id: 2, rid: 'rid2', name: 'Personal 2 (no master)' });
expect(actual.mixersByResourceId['rid2'].personal).toEqual({ id: 2, rid: 'rid2', name: 'Personal 2 (no master)' });
expect(actual.mixersByResourceId['rid2'].master).toBeUndefined();
});
});
describe('mixer type arrays', () => {
it('should handle setRecordingTrackMixers', () => {
const mixers = [{ id: 1, name: 'Recording Track 1' }];
const actual = mixersReducer(initialState, setRecordingTrackMixers(mixers));
expect(actual.recordingTrackMixers).toEqual(mixers);
});
it('should handle setBackingTrackMixers', () => {
const mixers = [{ id: 2, name: 'Backing Track 1' }];
const actual = mixersReducer(initialState, setBackingTrackMixers(mixers));
expect(actual.backingTrackMixers).toEqual(mixers);
});
it('should handle setJamTrackMixers', () => {
const mixers = [{ id: 3, name: 'Jam Track 1' }];
const actual = mixersReducer(initialState, setJamTrackMixers(mixers));
expect(actual.jamTrackMixers).toEqual(mixers);
});
it('should handle setMetronomeTrackMixers', () => {
const mixers = [{ id: 4, name: 'Metronome Track 1' }];
const actual = mixersReducer(initialState, setMetronomeTrackMixers(mixers));
expect(actual.metronomeTrackMixers).toEqual(mixers);
});
it('should handle setAdhocTrackMixers', () => {
const mixers = [{ id: 5, name: 'Adhoc Track 1' }];
const actual = mixersReducer(initialState, setAdhocTrackMixers(mixers));
expect(actual.adhocTrackMixers).toEqual(mixers);
});
});
describe('updateMixer', () => {
it('should update master mixer in allMixers and masterMixers arrays', () => {
const stateWithMixers = {
...initialState,
masterMixers: [
{ id: 1, rid: 'rid1', volume: 0.5 },
{ id: 2, rid: 'rid2', volume: 0.7 }
],
allMixers: {
'M1': { id: 1, rid: 'rid1', volume: 0.5 },
'M2': { id: 2, rid: 'rid2', volume: 0.7 }
}
};
const actual = mixersReducer(
stateWithMixers,
updateMixer({ mixerId: 1, mode: true, updates: { volume: 0.8 } })
);
expect(actual.allMixers['M1'].volume).toBe(0.8);
expect(actual.masterMixers[0].volume).toBe(0.8);
});
it('should update personal mixer in allMixers and personalMixers arrays', () => {
const stateWithMixers = {
...initialState,
personalMixers: [
{ id: 1, rid: 'rid1', volume: 0.5 },
{ id: 2, rid: 'rid2', volume: 0.7 }
],
allMixers: {
'P1': { id: 1, rid: 'rid1', volume: 0.5 },
'P2': { id: 2, rid: 'rid2', volume: 0.7 }
}
};
const actual = mixersReducer(
stateWithMixers,
updateMixer({ mixerId: 1, mode: false, updates: { volume: 0.9 } })
);
expect(actual.allMixers['P1'].volume).toBe(0.9);
expect(actual.personalMixers[0].volume).toBe(0.9);
});
it('should not fail when updating non-existent mixer', () => {
const actual = mixersReducer(
initialState,
updateMixer({ mixerId: 999, mode: true, updates: { volume: 0.5 } })
);
expect(actual).toEqual(initialState);
});
});
describe('metronome', () => {
it('should handle setMetronome with metronome object', () => {
const metronome = { bpm: 120, sound: 'Beep', meter: 1, mode: 0 };
const actual = mixersReducer(initialState, setMetronome(metronome));
expect(actual.metronome).toEqual(metronome);
expect(actual.mediaSummary.metronomeOpen).toBe(true);
});
it('should handle setMetronome with null (close metronome)', () => {
const stateWithMetronome = {
...initialState,
metronome: { bpm: 120 },
mediaSummary: { ...initialState.mediaSummary, metronomeOpen: true }
};
const actual = mixersReducer(stateWithMetronome, setMetronome(null));
expect(actual.metronome).toBeNull();
expect(actual.mediaSummary.metronomeOpen).toBe(false);
});
it('should handle setMetronomeSettings', () => {
const actual = mixersReducer(
initialState,
setMetronomeSettings({ tempo: 140, sound: 'Click', cricket: true })
);
expect(actual.metronomeSettings.tempo).toBe(140);
expect(actual.metronomeSettings.sound).toBe('Click');
expect(actual.metronomeSettings.cricket).toBe(true);
});
it('should handle partial metronome settings update', () => {
const actual = mixersReducer(
initialState,
setMetronomeSettings({ tempo: 160 })
);
expect(actual.metronomeSettings.tempo).toBe(160);
expect(actual.metronomeSettings.sound).toBe('Beep'); // Unchanged
expect(actual.metronomeSettings.cricket).toBe(false); // Unchanged
});
});
describe('media summary', () => {
it('should handle updateMediaSummary (partial update)', () => {
const actual = mixersReducer(
initialState,
updateMediaSummary({ backingTrackOpen: true, userNeedsMediaControls: true })
);
expect(actual.mediaSummary.backingTrackOpen).toBe(true);
expect(actual.mediaSummary.userNeedsMediaControls).toBe(true);
expect(actual.mediaSummary.jamTrackOpen).toBe(false); // Unchanged
});
it('should handle setMediaSummary (full replacement)', () => {
const newMediaSummary = {
mediaOpen: true,
backingTrackOpen: true,
jamTrackOpen: false,
recordingOpen: false,
metronomeOpen: false,
isOpener: true,
userNeedsMediaControls: true,
jamTrack: { id: 123, name: 'Test Track' }
};
const actual = mixersReducer(initialState, setMediaSummary(newMediaSummary));
expect(actual.mediaSummary).toEqual(newMediaSummary);
});
});
describe('chat, broadcast, recording mixers', () => {
it('should handle setChatMixer', () => {
const chatMixer = { id: 1, name: 'Chat Mixer' };
const actual = mixersReducer(initialState, setChatMixer(chatMixer));
expect(actual.chatMixer).toEqual(chatMixer);
});
it('should handle setBroadcastMixer', () => {
const broadcastMixer = { id: 2, name: 'Broadcast Mixer' };
const actual = mixersReducer(initialState, setBroadcastMixer(broadcastMixer));
expect(actual.broadcastMixer).toEqual(broadcastMixer);
});
it('should handle setRecordingMixer', () => {
const recordingMixer = { id: 3, name: 'Recording Mixer' };
const actual = mixersReducer(initialState, setRecordingMixer(recordingMixer));
expect(actual.recordingMixer).toEqual(recordingMixer);
});
});
describe('simulated category mixers', () => {
it('should handle setSimulatedMusicCategoryMixers', () => {
const mixers = { PERSONAL: { volume: 0.8 }, MASTER: { volume: 0.9 } };
const actual = mixersReducer(initialState, setSimulatedMusicCategoryMixers(mixers));
expect(actual.simulatedMusicCategoryMixers).toEqual(mixers);
});
it('should handle setSimulatedChatCategoryMixers', () => {
const mixers = { PERSONAL: { volume: 0.7 }, MASTER: { volume: 0.8 } };
const actual = mixersReducer(initialState, setSimulatedChatCategoryMixers(mixers));
expect(actual.simulatedChatCategoryMixers).toEqual(mixers);
});
});
describe('peer management', () => {
it('should handle addNoAudioUser', () => {
const actual = mixersReducer(initialState, addNoAudioUser('user123'));
expect(actual.noAudioUsers['user123']).toBe(true);
});
it('should handle removeNoAudioUser', () => {
const stateWithNoAudioUser = {
...initialState,
noAudioUsers: { 'user123': true, 'user456': true }
};
const actual = mixersReducer(stateWithNoAudioUser, removeNoAudioUser('user123'));
expect(actual.noAudioUsers['user123']).toBeUndefined();
expect(actual.noAudioUsers['user456']).toBe(true);
});
it('should handle setNoAudioUsers', () => {
const noAudioUsers = { 'user1': true, 'user2': true };
const actual = mixersReducer(initialState, setNoAudioUsers(noAudioUsers));
expect(actual.noAudioUsers).toEqual(noAudioUsers);
});
it('should handle setClientsWithAudioOverride', () => {
const clients = { 'client1': true, 'client2': false };
const actual = mixersReducer(initialState, setClientsWithAudioOverride(clients));
expect(actual.clientsWithAudioOverride).toEqual(clients);
});
it('should handle setMissingMixerPeers', () => {
const peers = { 'peer1': true };
const actual = mixersReducer(initialState, setMissingMixerPeers(peers));
expect(actual.missingMixerPeers).toEqual(peers);
});
it('should handle setCheckingMissingPeers', () => {
const checking = { 'peer1': true };
const actual = mixersReducer(initialState, setCheckingMissingPeers(checking));
expect(actual.checkingMissingPeers).toEqual(checking);
});
});
describe('clearMixers', () => {
it('should reset to initial state', () => {
const stateWithData = {
...initialState,
chatMixer: { id: 1 },
masterMixers: [{ id: 1 }],
personalMixers: [{ id: 2 }],
allMixers: { 'M1': { id: 1 } },
isReady: true,
metronome: { bpm: 120 }
};
const actual = mixersReducer(stateWithData, clearMixers());
expect(actual).toEqual(initialState);
});
});
describe('selectors', () => {
const mockState = {
mixers: {
...initialState,
chatMixer: { id: 1, name: 'Chat' },
broadcastMixer: { id: 2, name: 'Broadcast' },
masterMixers: [{ id: 1 }],
personalMixers: [{ id: 2 }],
allMixers: { 'M1': { id: 1 }, 'P2': { id: 2 } },
mixersByResourceId: { 'rid1': { master: { id: 1 }, personal: { id: 2 } } },
mixersByTrackId: { '1': { master: { id: 1 }, personal: { id: 2 } } },
metronome: { bpm: 120 },
metronomeSettings: { tempo: 140, sound: 'Click', cricket: false },
mediaSummary: { backingTrackOpen: true },
isReady: true,
backingTrackMixers: [{ id: 3 }],
noAudioUsers: { 'user1': true }
}
};
it('should select all mixers', () => {
expect(selectAllMixers(mockState)).toEqual({ 'M1': { id: 1 }, 'P2': { id: 2 } });
});
it('should select master mixers', () => {
expect(selectMasterMixers(mockState)).toEqual([{ id: 1 }]);
});
it('should select personal mixers', () => {
expect(selectPersonalMixers(mockState)).toEqual([{ id: 2 }]);
});
it('should select mixers by resource ID', () => {
expect(selectMixersByResourceId(mockState)).toEqual({ 'rid1': { master: { id: 1 }, personal: { id: 2 } } });
});
it('should select mixers by track ID', () => {
expect(selectMixersByTrackId(mockState)).toEqual({ '1': { master: { id: 1 }, personal: { id: 2 } } });
});
it('should select chat mixer', () => {
expect(selectChatMixer(mockState)).toEqual({ id: 1, name: 'Chat' });
});
it('should select broadcast mixer', () => {
expect(selectBroadcastMixer(mockState)).toEqual({ id: 2, name: 'Broadcast' });
});
it('should select metronome', () => {
expect(selectMetronome(mockState)).toEqual({ bpm: 120 });
});
it('should select metronome settings', () => {
expect(selectMetronomeSettings(mockState)).toEqual({ tempo: 140, sound: 'Click', cricket: false });
});
it('should select media summary', () => {
expect(selectMediaSummary(mockState)).toEqual({ backingTrackOpen: true });
});
it('should select mixers ready flag', () => {
expect(selectMixersReady(mockState)).toBe(true);
});
it('should select backing track mixers', () => {
expect(selectBackingTrackMixers(mockState)).toEqual([{ id: 3 }]);
});
it('should select no audio users', () => {
expect(selectNoAudioUsers(mockState)).toEqual({ 'user1': true });
});
it('should select specific mixer by ID and mode', () => {
expect(selectMixer(1, true)(mockState)).toEqual({ id: 1 });
expect(selectMixer(2, false)(mockState)).toEqual({ id: 2 });
});
it('should select mixer pair by resource ID', () => {
expect(selectMixerPairByResourceId('rid1')(mockState)).toEqual({ master: { id: 1 }, personal: { id: 2 } });
});
it('should select mixer pair by track ID', () => {
expect(selectMixerPairByTrackId('1')(mockState)).toEqual({ master: { id: 1 }, personal: { id: 2 } });
});
});
});
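The `organizeMixers` expectations above encode a specific key scheme: master mixers land in `allMixers` under `'M' + id`, personal mixers under `'P' + id`, and both are cross-indexed by `rid` and track id so a master/personal pair is one lookup away. A standalone sketch of that indexing (hypothetical helper, not the slice's actual reducer):

```javascript
// Hypothetical helper isolating the lookup-table construction that the
// organizeMixers tests assert against. Keys: 'M' + id for master mixers,
// 'P' + id for personal mixers; pairs are indexed by rid and by track id.
function buildLookups(masterMixers, personalMixers) {
  const allMixers = {};
  const mixersByResourceId = {};
  const mixersByTrackId = {};

  for (const m of masterMixers) {
    allMixers[`M${m.id}`] = m;
    mixersByResourceId[m.rid] = { ...(mixersByResourceId[m.rid] || {}), master: m };
    mixersByTrackId[m.id] = { ...(mixersByTrackId[m.id] || {}), master: m };
  }
  for (const p of personalMixers) {
    allMixers[`P${p.id}`] = p;
    mixersByResourceId[p.rid] = { ...(mixersByResourceId[p.rid] || {}), personal: p };
    mixersByTrackId[p.id] = { ...(mixersByTrackId[p.id] || {}), personal: p };
  }
  return { allMixers, mixersByResourceId, mixersByTrackId };
}

const { allMixers, mixersByResourceId } = buildLookups(
  [{ id: 1, rid: 'rid1' }],
  [{ id: 1, rid: 'rid1' }, { id: 2, rid: 'rid2' }]
);
console.log(Object.keys(allMixers)); // ['M1', 'P1', 'P2']
```

A personal mixer with no matching master still gets its `P` entry; only the `master` half of its pair is absent, which is exactly what the "no matching master" test asserts.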


@@ -10,13 +10,36 @@ import sessionUIReducer, {
toggleParticipantVideos,
setShowMixer,
toggleMixer,
// Phase 5: Media UI actions
setShowMyMixes,
toggleMyMixes,
setShowCustomMixes,
toggleCustomMixes,
setEditingMixdownId,
setCreatingMixdown,
setCreateMixdownErrors,
// Phase 5: Mixer UI actions
setMixMode,
toggleMixMode,
setCurrentMixerRange,
resetUI,
selectModal,
selectAllModals,
selectIsAnyModalOpen,
selectPanel,
selectParticipantLayout,
selectShowMixer,
// Phase 5: Media UI selectors
selectShowMyMixes,
selectShowCustomMixes,
selectEditingMixdownId,
selectCreatingMixdown,
selectCreateMixdownErrors,
selectMediaUI,
// Phase 5: Mixer UI selectors
selectMixMode,
selectCurrentMixerRange,
selectMixerUI
} from '../sessionUISlice';
describe('sessionUISlice', () => {
@@ -44,6 +67,22 @@ describe('sessionUISlice', () => {
showParticipantVideos: true,
showMixer: false,
sidebarCollapsed: false
},
// Phase 5: Media UI state
mediaUI: {
showMyMixes: false,
showCustomMixes: false,
editingMixdownId: null,
creatingMixdown: false,
createMixdownErrors: null
},
// Phase 5: Mixer UI state
mixerUI: {
mixMode: 'PERSONAL',
currentMixerRange: {
min: null,
max: null
}
}
};
@@ -281,4 +320,179 @@ describe('sessionUISlice', () => {
expect(selectShowMixer(mockState)).toBe(true);
});
});
// Phase 5: Media UI tests
describe('media UI actions', () => {
it('should handle setShowMyMixes', () => {
const actual = sessionUIReducer(initialState, setShowMyMixes(true));
expect(actual.mediaUI.showMyMixes).toBe(true);
});
it('should handle toggleMyMixes', () => {
const actual1 = sessionUIReducer(initialState, toggleMyMixes());
expect(actual1.mediaUI.showMyMixes).toBe(true);
const actual2 = sessionUIReducer(actual1, toggleMyMixes());
expect(actual2.mediaUI.showMyMixes).toBe(false);
});
it('should handle setShowCustomMixes', () => {
const actual = sessionUIReducer(initialState, setShowCustomMixes(true));
expect(actual.mediaUI.showCustomMixes).toBe(true);
});
it('should handle toggleCustomMixes', () => {
const actual1 = sessionUIReducer(initialState, toggleCustomMixes());
expect(actual1.mediaUI.showCustomMixes).toBe(true);
const actual2 = sessionUIReducer(actual1, toggleCustomMixes());
expect(actual2.mediaUI.showCustomMixes).toBe(false);
});
it('should handle setEditingMixdownId', () => {
const actual = sessionUIReducer(initialState, setEditingMixdownId('mixdown-123'));
expect(actual.mediaUI.editingMixdownId).toBe('mixdown-123');
});
it('should handle setCreatingMixdown', () => {
const actual = sessionUIReducer(initialState, setCreatingMixdown(true));
expect(actual.mediaUI.creatingMixdown).toBe(true);
});
it('should handle setCreateMixdownErrors', () => {
const errors = { name: 'Name is required' };
const actual = sessionUIReducer(initialState, setCreateMixdownErrors(errors));
expect(actual.mediaUI.createMixdownErrors).toEqual(errors);
});
it('should clear mixdown errors with null', () => {
const stateWithErrors = {
...initialState,
mediaUI: { ...initialState.mediaUI, createMixdownErrors: { name: 'Error' } }
};
const actual = sessionUIReducer(stateWithErrors, setCreateMixdownErrors(null));
expect(actual.mediaUI.createMixdownErrors).toBeNull();
});
});
// Phase 5: Mixer UI tests
describe('mixer UI actions', () => {
it('should handle setMixMode', () => {
const actual = sessionUIReducer(initialState, setMixMode('MASTER'));
expect(actual.mixerUI.mixMode).toBe('MASTER');
});
it('should handle toggleMixMode from PERSONAL to MASTER', () => {
const actual = sessionUIReducer(initialState, toggleMixMode());
expect(actual.mixerUI.mixMode).toBe('MASTER');
});
it('should handle toggleMixMode from MASTER to PERSONAL', () => {
const stateWithMaster = {
...initialState,
mixerUI: { ...initialState.mixerUI, mixMode: 'MASTER' }
};
const actual = sessionUIReducer(stateWithMaster, toggleMixMode());
expect(actual.mixerUI.mixMode).toBe('PERSONAL');
});
it('should handle setCurrentMixerRange', () => {
const range = { min: 0, max: 100 };
const actual = sessionUIReducer(initialState, setCurrentMixerRange(range));
expect(actual.mixerUI.currentMixerRange).toEqual(range);
});
it('should handle setCurrentMixerRange when a range is already set', () => {
const stateWithRange = {
...initialState,
mixerUI: { ...initialState.mixerUI, currentMixerRange: { min: 10, max: 90 } }
};
const actual = sessionUIReducer(stateWithRange, setCurrentMixerRange({ min: 10, max: 90 }));
expect(actual.mixerUI.currentMixerRange.min).toBe(10);
expect(actual.mixerUI.currentMixerRange.max).toBe(90);
});
});
// Phase 5: Extended resetUI test
describe('resetUI with Phase 5 state', () => {
it('should reset media UI and mixer UI to initial state', () => {
const modifiedState = {
...initialState,
mediaUI: {
showMyMixes: true,
showCustomMixes: true,
editingMixdownId: 'mix-456',
creatingMixdown: true,
createMixdownErrors: { error: 'Some error' }
},
mixerUI: {
mixMode: 'MASTER',
currentMixerRange: { min: 10, max: 90 }
}
};
const actual = sessionUIReducer(modifiedState, resetUI());
expect(actual.mediaUI).toEqual(initialState.mediaUI);
expect(actual.mixerUI).toEqual(initialState.mixerUI);
});
});
// Phase 5: Media UI selectors
describe('media UI selectors', () => {
const mockStateWithPhase5 = {
sessionUI: {
...initialState,
mediaUI: {
showMyMixes: true,
showCustomMixes: false,
editingMixdownId: 'mix-789',
creatingMixdown: true,
createMixdownErrors: { name: 'Required' }
},
mixerUI: {
mixMode: 'MASTER',
currentMixerRange: { min: 20, max: 80 }
}
}
};
it('selectShowMyMixes should return correct value', () => {
expect(selectShowMyMixes(mockStateWithPhase5)).toBe(true);
});
it('selectShowCustomMixes should return correct value', () => {
expect(selectShowCustomMixes(mockStateWithPhase5)).toBe(false);
});
it('selectEditingMixdownId should return correct value', () => {
expect(selectEditingMixdownId(mockStateWithPhase5)).toBe('mix-789');
});
it('selectCreatingMixdown should return correct value', () => {
expect(selectCreatingMixdown(mockStateWithPhase5)).toBe(true);
});
it('selectCreateMixdownErrors should return correct value', () => {
expect(selectCreateMixdownErrors(mockStateWithPhase5)).toEqual({ name: 'Required' });
});
it('selectMediaUI should return entire mediaUI object', () => {
expect(selectMediaUI(mockStateWithPhase5)).toEqual(mockStateWithPhase5.sessionUI.mediaUI);
});
it('selectMixMode should return correct mix mode', () => {
expect(selectMixMode(mockStateWithPhase5)).toBe('MASTER');
});
it('selectCurrentMixerRange should return correct range', () => {
expect(selectCurrentMixerRange(mockStateWithPhase5)).toEqual({ min: 20, max: 80 });
});
it('selectMixerUI should return entire mixerUI object', () => {
expect(selectMixerUI(mockStateWithPhase5)).toEqual(mockStateWithPhase5.sessionUI.mixerUI);
});
});
});


@@ -0,0 +1,183 @@
import { createSlice, createAsyncThunk } from '@reduxjs/toolkit';
// Async thunks for media actions
export const openBackingTrack = createAsyncThunk(
'media/openBackingTrack',
async ({ file, jamClient }, { rejectWithValue }) => {
try {
await jamClient.SessionOpenBackingTrackFile(file, false);
return { file };
} catch (error) {
return rejectWithValue(error.message);
}
}
);
export const loadJamTrack = createAsyncThunk(
'media/loadJamTrack',
async ({ jamTrack, jamClient }, { rejectWithValue }) => {
try {
// Load JMep data if available (matches MediaContext:163-170 logic)
if (jamTrack.jmep) {
const sampleRate = await jamClient.GetSampleRate();
const sampleRateForFilename = sampleRate === 48 ? '48' : '44';
const fqId = `${jamTrack.id}-${sampleRateForFilename}`;
await jamClient.JamTrackLoadJmep(fqId, jamTrack.jmep);
}
// Play/load the jamtrack
const result = await jamClient.JamTrackPlay(jamTrack.id);
if (!result) {
throw new Error('Unable to open JamTrack');
}
return { jamTrack, result };
} catch (error) {
return rejectWithValue(error.message);
}
}
);
export const closeMedia = createAsyncThunk(
'media/closeMedia',
async ({ force = false, jamClient }, { rejectWithValue }) => {
try {
await jamClient.SessionCloseMedia(force);
return null;
} catch (error) {
return rejectWithValue(error.message);
}
}
);
const initialState = {
// Media arrays (resolved/enriched data from mixer system)
// Format matches useMixerHelper resolved data structures
backingTracks: [], // [{ isOpener, shortFilename, instrumentIcon, photoUrl, showLoop, track, mixers }]
jamTracks: [], // [{ name, trackName, part, isOpener, instrumentIcon, track, mixers }]
recordedTracks: [], // [{ recordingName, isOpener, userName, instrumentIcon, track, mixers }]
// JamTrack state (real-time playback state from native client)
jamTrackState: {}, // Real-time jamTrack playback state from WebSocket JAM_TRACK_CHANGES
downloadingJamTrack: false,
// Loading states for async operations
loading: {
backingTrack: false,
jamTrack: false,
closing: false
},
error: null
};
export const mediaSlice = createSlice({
name: 'media',
initialState,
reducers: {
// Set resolved media data (called from useMixerHelper after processing)
setBackingTracks: (state, action) => {
state.backingTracks = action.payload;
},
setJamTracks: (state, action) => {
state.jamTracks = action.payload;
},
setRecordedTracks: (state, action) => {
state.recordedTracks = action.payload;
},
// JamTrack state updates (from WebSocket JAM_TRACK_CHANGES messages)
updateJamTrackState: (state, action) => {
state.jamTrackState = { ...state.jamTrackState, ...action.payload };
},
clearJamTrackState: (state) => {
state.jamTrackState = {};
},
// Clear all media (called on session end or media close)
clearAllMedia: (state) => {
state.backingTracks = [];
state.jamTracks = [];
state.recordedTracks = [];
state.jamTrackState = {};
state.downloadingJamTrack = false;
state.error = null;
}
},
extraReducers: (builder) => {
builder
// Open backing track
.addCase(openBackingTrack.pending, (state) => {
state.loading.backingTrack = true;
state.error = null;
})
.addCase(openBackingTrack.fulfilled, (state) => {
state.loading.backingTrack = false;
// Note: backingTracks array updated by MIXER_CHANGES WebSocket message
// which triggers useMixerHelper to resolve and dispatch setBackingTracks
})
.addCase(openBackingTrack.rejected, (state, action) => {
state.loading.backingTrack = false;
state.error = action.payload;
})
// Load jam track
.addCase(loadJamTrack.pending, (state) => {
state.loading.jamTrack = true;
state.downloadingJamTrack = true;
state.error = null;
})
.addCase(loadJamTrack.fulfilled, (state) => {
state.loading.jamTrack = false;
state.downloadingJamTrack = false;
// Note: jamTracks array updated by MIXER_CHANGES WebSocket message
})
.addCase(loadJamTrack.rejected, (state, action) => {
state.loading.jamTrack = false;
state.downloadingJamTrack = false;
state.error = action.payload;
})
// Close media
.addCase(closeMedia.pending, (state) => {
state.loading.closing = true;
})
.addCase(closeMedia.fulfilled, (state) => {
state.loading.closing = false;
// Clear all media data
state.backingTracks = [];
state.jamTracks = [];
state.recordedTracks = [];
state.jamTrackState = {};
})
.addCase(closeMedia.rejected, (state, action) => {
state.loading.closing = false;
state.error = action.payload;
});
}
});
export const {
setBackingTracks,
setJamTracks,
setRecordedTracks,
updateJamTrackState,
clearJamTrackState,
clearAllMedia
} = mediaSlice.actions;
export default mediaSlice.reducer;
// Selectors
export const selectBackingTracks = (state) => state.media.backingTracks;
export const selectJamTracks = (state) => state.media.jamTracks;
export const selectRecordedTracks = (state) => state.media.recordedTracks;
export const selectJamTrackState = (state) => state.media.jamTrackState;
export const selectDownloadingJamTrack = (state) => state.media.downloadingJamTrack;
export const selectMediaLoading = (state) => state.media.loading;
export const selectMediaError = (state) => state.media.error;

View File

@ -0,0 +1,315 @@
import { createSlice } from '@reduxjs/toolkit';
const initialState = {
// Core mixer collections
chatMixer: null,
broadcastMixer: null,
recordingMixer: null,
// Mixer arrays by type (populated by groupMixersByType)
recordingTrackMixers: [],
backingTrackMixers: [],
jamTrackMixers: [],
metronomeTrackMixers: [],
adhocTrackMixers: [],
// Master and personal mixer arrays (source data from jamClient)
masterMixers: [],
personalMixers: [],
// Lookup tables (computed by organizeMixers reducer)
allMixers: {}, // Format: { 'M123': mixer, 'P456': mixer }
mixersByResourceId: {}, // Format: { 'rid': { master: mixer, personal: mixer } }
mixersByTrackId: {}, // Format: { 'trackId': { master: mixer, personal: mixer } }
// Simulated/derived mixers for category controls
simulatedMusicCategoryMixers: {
PERSONAL: null,
MASTER: null
},
simulatedChatCategoryMixers: {
PERSONAL: null,
MASTER: null
},
// Metronome state
metronome: null, // Current metronome mixer object
metronomeSettings: {
tempo: 120,
sound: "Beep",
cricket: false
},
// Media summary (open/closed states for media types)
// This tracks what media is currently open in the session
mediaSummary: {
mediaOpen: false,
backingTrackOpen: false,
jamTrackOpen: false,
recordingOpen: false,
metronomeOpen: false,
isOpener: false,
userNeedsMediaControls: false,
jamTrack: null
},
// Peer state (for managing missing audio users)
noAudioUsers: {},
clientsWithAudioOverride: {},
missingMixerPeers: {},
checkingMissingPeers: {},
// Ready state (indicates mixers have been organized and are ready to use)
isReady: false,
// Loading/errors
loading: false,
error: null
};
export const mixersSlice = createSlice({
name: 'mixers',
initialState,
reducers: {
// Initialize from jamClient.SessionGetAllControlState
setMasterMixers: (state, action) => {
state.masterMixers = action.payload;
},
setPersonalMixers: (state, action) => {
state.personalMixers = action.payload;
},
// Organize mixers - builds lookup tables (matches useMixerHelper organizeMixers logic)
// This should be called after master/personal mixers are set
organizeMixers: (state) => {
const newAllMixers = {};
const newMixersByResourceId = {};
const newMixersByTrackId = {};
// Process master mixers
for (const masterMixer of state.masterMixers) {
newAllMixers['M' + masterMixer.id] = masterMixer;
const mixerPair = {};
newMixersByResourceId[masterMixer.rid] = mixerPair;
newMixersByTrackId[masterMixer.id] = mixerPair;
mixerPair.master = masterMixer;
}
// Process personal mixers
for (const personalMixer of state.personalMixers) {
newAllMixers['P' + personalMixer.id] = personalMixer;
let mixerPair = newMixersByResourceId[personalMixer.rid];
if (!mixerPair) {
// Create new pair if no master exists (e.g., MonitorGroup)
mixerPair = {};
newMixersByResourceId[personalMixer.rid] = mixerPair;
}
newMixersByTrackId[personalMixer.id] = mixerPair;
mixerPair.personal = personalMixer;
}
// Update state
state.allMixers = { ...state.allMixers, ...newAllMixers };
state.mixersByResourceId = { ...state.mixersByResourceId, ...newMixersByResourceId };
state.mixersByTrackId = { ...state.mixersByTrackId, ...newMixersByTrackId };
state.isReady = true;
},
// Group mixers by type - categorizes mixers into recording, backing, jam, metronome, adhoc
// The categorization logic itself will live in the Redux version of useMixerHelper;
// for now these reducers simply accept the pre-categorized arrays.
setRecordingTrackMixers: (state, action) => {
state.recordingTrackMixers = action.payload;
},
setBackingTrackMixers: (state, action) => {
state.backingTrackMixers = action.payload;
},
setJamTrackMixers: (state, action) => {
state.jamTrackMixers = action.payload;
},
setMetronomeTrackMixers: (state, action) => {
state.metronomeTrackMixers = action.payload;
},
setAdhocTrackMixers: (state, action) => {
state.adhocTrackMixers = action.payload;
},
// Update individual mixer (for real-time updates from jamClient)
updateMixer: (state, action) => {
const { mixerId, mode, updates } = action.payload;
const key = (mode ? 'M' : 'P') + mixerId;
if (state.allMixers[key]) {
state.allMixers[key] = { ...state.allMixers[key], ...updates };
// Also update in master/personal arrays
if (mode) {
const index = state.masterMixers.findIndex(m => m.id === mixerId);
if (index !== -1) {
state.masterMixers[index] = { ...state.masterMixers[index], ...updates };
}
} else {
const index = state.personalMixers.findIndex(m => m.id === mixerId);
if (index !== -1) {
state.personalMixers[index] = { ...state.personalMixers[index], ...updates };
}
}
}
},
// Metronome
setMetronome: (state, action) => {
state.metronome = action.payload;
state.mediaSummary.metronomeOpen = !!action.payload;
},
setMetronomeSettings: (state, action) => {
state.metronomeSettings = { ...state.metronomeSettings, ...action.payload };
},
// Media summary updates
updateMediaSummary: (state, action) => {
state.mediaSummary = { ...state.mediaSummary, ...action.payload };
},
setMediaSummary: (state, action) => {
state.mediaSummary = action.payload;
},
// Chat/broadcast mixers
setChatMixer: (state, action) => {
state.chatMixer = action.payload;
},
setBroadcastMixer: (state, action) => {
state.broadcastMixer = action.payload;
},
setRecordingMixer: (state, action) => {
state.recordingMixer = action.payload;
},
// Simulated mixers (for category controls)
setSimulatedMusicCategoryMixers: (state, action) => {
state.simulatedMusicCategoryMixers = action.payload;
},
setSimulatedChatCategoryMixers: (state, action) => {
state.simulatedChatCategoryMixers = action.payload;
},
// Peer management
addNoAudioUser: (state, action) => {
state.noAudioUsers[action.payload] = true;
},
removeNoAudioUser: (state, action) => {
delete state.noAudioUsers[action.payload];
},
setNoAudioUsers: (state, action) => {
state.noAudioUsers = action.payload;
},
setClientsWithAudioOverride: (state, action) => {
state.clientsWithAudioOverride = action.payload;
},
setMissingMixerPeers: (state, action) => {
state.missingMixerPeers = action.payload;
},
setCheckingMissingPeers: (state, action) => {
state.checkingMissingPeers = action.payload;
},
// Clear on session end
clearMixers: () => initialState
}
});
export const {
setMasterMixers,
setPersonalMixers,
organizeMixers,
setRecordingTrackMixers,
setBackingTrackMixers,
setJamTrackMixers,
setMetronomeTrackMixers,
setAdhocTrackMixers,
updateMixer,
setMetronome,
setMetronomeSettings,
updateMediaSummary,
setMediaSummary,
setChatMixer,
setBroadcastMixer,
setRecordingMixer,
setSimulatedMusicCategoryMixers,
setSimulatedChatCategoryMixers,
addNoAudioUser,
removeNoAudioUser,
setNoAudioUsers,
setClientsWithAudioOverride,
setMissingMixerPeers,
setCheckingMissingPeers,
clearMixers
} = mixersSlice.actions;
export default mixersSlice.reducer;
// Selectors
export const selectAllMixers = (state) => state.mixers.allMixers;
export const selectMasterMixers = (state) => state.mixers.masterMixers;
export const selectPersonalMixers = (state) => state.mixers.personalMixers;
export const selectMixersByResourceId = (state) => state.mixers.mixersByResourceId;
export const selectMixersByTrackId = (state) => state.mixers.mixersByTrackId;
export const selectChatMixer = (state) => state.mixers.chatMixer;
export const selectBroadcastMixer = (state) => state.mixers.broadcastMixer;
export const selectRecordingMixer = (state) => state.mixers.recordingMixer;
export const selectMetronome = (state) => state.mixers.metronome;
export const selectMetronomeSettings = (state) => state.mixers.metronomeSettings;
export const selectMediaSummary = (state) => state.mixers.mediaSummary;
export const selectMixersReady = (state) => state.mixers.isReady;
export const selectBackingTrackMixers = (state) => state.mixers.backingTrackMixers;
export const selectJamTrackMixers = (state) => state.mixers.jamTrackMixers;
export const selectRecordingTrackMixers = (state) => state.mixers.recordingTrackMixers;
export const selectMetronomeTrackMixers = (state) => state.mixers.metronomeTrackMixers;
export const selectAdhocTrackMixers = (state) => state.mixers.adhocTrackMixers;
export const selectSimulatedMusicCategoryMixers = (state) => state.mixers.simulatedMusicCategoryMixers;
export const selectSimulatedChatCategoryMixers = (state) => state.mixers.simulatedChatCategoryMixers;
export const selectNoAudioUsers = (state) => state.mixers.noAudioUsers;
export const selectClientsWithAudioOverride = (state) => state.mixers.clientsWithAudioOverride;
export const selectMissingMixerPeers = (state) => state.mixers.missingMixerPeers;
export const selectCheckingMissingPeers = (state) => state.mixers.checkingMissingPeers;
// Computed selector factory: returns a selector for a specific mixer by ID and mode.
// Note: each call creates a new function, so memoize (useMemo/reselect) before
// passing the result to useSelector if reference stability matters.
export const selectMixer = (mixerId, mode) => (state) => {
const key = (mode ? 'M' : 'P') + mixerId;
return state.mixers.allMixers[key];
};
// Computed selector for getting mixer pair by resource ID
export const selectMixerPairByResourceId = (resourceId) => (state) => {
return state.mixers.mixersByResourceId[resourceId];
};
// Computed selector for getting mixer pair by track ID
export const selectMixerPairByTrackId = (trackId) => (state) => {
return state.mixers.mixersByTrackId[trackId];
};
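The pairing scheme that `organizeMixers` builds can be exercised standalone. The sketch below reproduces the two loops with hypothetical sample mixers (the ids and rids are invented for illustration); it shows why a personal mixer without a master, such as a MonitorGroup, still gets a pair entry.

```javascript
// Standalone sketch of the lookup-table construction in organizeMixers.
function buildLookups(masterMixers, personalMixers) {
  const allMixers = {};
  const byResourceId = {};
  const byTrackId = {};
  for (const m of masterMixers) {
    allMixers['M' + m.id] = m;
    const pair = { master: m };
    byResourceId[m.rid] = pair;
    byTrackId[m.id] = pair;
  }
  for (const p of personalMixers) {
    allMixers['P' + p.id] = p;
    // Reuse the master's pair when one exists; otherwise create a fresh pair
    // (e.g. a MonitorGroup that has no master counterpart).
    const pair = byResourceId[p.rid] || (byResourceId[p.rid] = {});
    byTrackId[p.id] = pair;
    pair.personal = p;
  }
  return { allMixers, byResourceId, byTrackId };
}

const { byResourceId, byTrackId } = buildLookups(
  [{ id: 1, rid: 'track-a' }],
  [{ id: 2, rid: 'track-a' }, { id: 3, rid: 'monitor-b' }]
);
console.log(byResourceId['track-a'].master.id);          // 1
console.log(byResourceId['track-a'].personal.id);        // 2
console.log(byTrackId[3] === byResourceId['monitor-b']); // true (same pair object)
```

Because both lookup tables hold references to the same pair objects, updating a mixer through one table is visible through the other.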

View File

@ -27,6 +27,24 @@ const initialState = {
showParticipantVideos: true,
showMixer: false,
sidebarCollapsed: false
},
// Media UI state (Phase 5: from MediaContext)
mediaUI: {
showMyMixes: false,
showCustomMixes: false,
editingMixdownId: null,
creatingMixdown: false,
createMixdownErrors: null
},
// Mixer UI state (Phase 5: from useMixerHelper/useMixerStore)
mixerUI: {
mixMode: 'PERSONAL', // 'PERSONAL' | 'MASTER'
currentMixerRange: {
min: null,
max: null
}
}
};
@ -95,6 +113,48 @@ export const sessionUISlice = createSlice({
state.view.sidebarCollapsed = action.payload;
},
// Phase 5: Media UI reducers
setShowMyMixes: (state, action) => {
state.mediaUI.showMyMixes = action.payload;
},
toggleMyMixes: (state) => {
state.mediaUI.showMyMixes = !state.mediaUI.showMyMixes;
},
setShowCustomMixes: (state, action) => {
state.mediaUI.showCustomMixes = action.payload;
},
toggleCustomMixes: (state) => {
state.mediaUI.showCustomMixes = !state.mediaUI.showCustomMixes;
},
setEditingMixdownId: (state, action) => {
state.mediaUI.editingMixdownId = action.payload;
},
setCreatingMixdown: (state, action) => {
state.mediaUI.creatingMixdown = action.payload;
},
setCreateMixdownErrors: (state, action) => {
state.mediaUI.createMixdownErrors = action.payload;
},
// Phase 5: Mixer UI reducers
setMixMode: (state, action) => {
state.mixerUI.mixMode = action.payload;
},
toggleMixMode: (state) => {
state.mixerUI.mixMode = state.mixerUI.mixMode === 'PERSONAL' ? 'MASTER' : 'PERSONAL';
},
setCurrentMixerRange: (state, action) => {
state.mixerUI.currentMixerRange = action.payload;
},
// Reset UI state (useful when leaving session)
resetUI: () => {
return { ...initialState };
@ -117,6 +177,18 @@ export const {
setShowMixer,
toggleSidebar,
setSidebarCollapsed,
// Phase 5: Media UI actions
setShowMyMixes,
toggleMyMixes,
setShowCustomMixes,
toggleCustomMixes,
setEditingMixdownId,
setCreatingMixdown,
setCreateMixdownErrors,
// Phase 5: Mixer UI actions
setMixMode,
toggleMixMode,
setCurrentMixerRange,
resetUI
} = sessionUISlice.actions;
@ -135,3 +207,16 @@ export const selectShowParticipantVideos = (state) => state.sessionUI.view.showP
export const selectShowMixer = (state) => state.sessionUI.view.showMixer;
export const selectSidebarCollapsed = (state) => state.sessionUI.view.sidebarCollapsed;
export const selectViewPreferences = (state) => state.sessionUI.view;
// Phase 5: Media UI selectors
export const selectShowMyMixes = (state) => state.sessionUI.mediaUI.showMyMixes;
export const selectShowCustomMixes = (state) => state.sessionUI.mediaUI.showCustomMixes;
export const selectEditingMixdownId = (state) => state.sessionUI.mediaUI.editingMixdownId;
export const selectCreatingMixdown = (state) => state.sessionUI.mediaUI.creatingMixdown;
export const selectCreateMixdownErrors = (state) => state.sessionUI.mediaUI.createMixdownErrors;
export const selectMediaUI = (state) => state.sessionUI.mediaUI;
// Phase 5: Mixer UI selectors
export const selectMixMode = (state) => state.sessionUI.mixerUI.mixMode;
export const selectCurrentMixerRange = (state) => state.sessionUI.mixerUI.currentMixerRange;
export const selectMixerUI = (state) => state.sessionUI.mixerUI;
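The PERSONAL/MASTER flip in `toggleMixMode` reduces to a pure function, which makes its invariant easy to state. This is just a sketch of the same ternary; the real reducer mutates `state.mixerUI.mixMode` through Immer.

```javascript
// Pure form of the toggle implemented by the toggleMixMode reducer.
const flipMixMode = (mode) => (mode === 'PERSONAL' ? 'MASTER' : 'PERSONAL');

console.log(flipMixMode('PERSONAL'));               // MASTER
console.log(flipMixMode(flipMixMode('MASTER')));    // MASTER (double toggle is a no-op)
```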

View File

@ -13,6 +13,8 @@ import myJamTracksSlice from "./features/myJamTracksSlice"
import jamTrackSlice from "./features/jamTrackSlice"
import activeSessionReducer from "./features/activeSessionSlice"
import sessionUIReducer from "./features/sessionUISlice"
import mixersReducer from "./features/mixersSlice"
import mediaReducer from "./features/mediaSlice"
export default configureStore({
reducer: {
@ -23,6 +25,8 @@ export default configureStore({
session: sessionReducer, // this is the slice that holds the session lists
activeSession: activeSessionReducer, // this is the slice that holds the currently active session
sessionUI: sessionUIReducer, // this is the slice that holds the session UI state (modals, panels)
mixers: mixersReducer, // Phase 5: mixer state (chat, broadcast, track mixers, metronome, media summary)
media: mediaReducer, // Phase 5: media data (backing tracks, jam tracks, recorded tracks)
latency: latencyReducer,
onlineMusician: onlineMusicianReducer,
lobbyChat: lobbyChatMessagesReducer,