This commit is contained in:
Stephan Schellworth 2026-01-05 08:13:03 +01:00
commit dd79895c97
72 changed files with 4083 additions and 890 deletions

@ -0,0 +1,575 @@
# Dashboard Log Polling and Rendering Documentation
## Overview
This documentation explains the complete flow of how dashboard messages (logs with `operationId`) are polled, processed, sorted, and rendered in the workflow dashboard. The system uses a hierarchical tree structure to display operations and their progress, with real-time updates through polling.
## Architecture Flow
The system follows this flow:
1. **Polling Controller** (`workflowPollingController.js`) - Manages polling intervals and scheduling
2. **Data Layer** (`workflowData.js`) - Fetches data from API and routes logs to appropriate handlers
3. **Dashboard Processor** (`workflowUiRendererDashboard.js`) - Processes logs with `operationId` and builds hierarchical tree
4. **Dashboard Renderer** (`workflowUiRendererDashboard.js`) - Renders the hierarchical tree structure
## Key Files
- `workflowPollingController.js` - Centralized polling controller
- `workflowData.js` - API communication and data routing
- `workflowUiRendererDashboard.js` - Dashboard log processing and rendering
- `workflowCoordination.js` - State management coordination
## Implementation Details
### 1. Polling Mechanism
**File**: `frontend_agents/public/js/modules/workflowPollingController.js`
The polling controller uses a recursive `setTimeout` approach to create an infinite polling chain. This ensures continuous updates while preventing race conditions and rate limiting issues.
#### Configuration
- **Base interval**: 5 seconds (`baseInterval = 5000`)
- **Maximum interval**: 10 seconds (`maxInterval = 10000`)
- **Exponential backoff multiplier**: 1.5
- **Concurrency prevention**: Uses `isPollInProgress` flag to prevent multiple simultaneous polls
#### Key Methods
**`startPolling(workflowId)`**
- Starts polling for a specific workflow
- Stops any existing polling before starting a new one
- Sets `activeWorkflowId` and `isPolling` flag
- Executes immediate first poll (no delay)
- Validates workflow ID before starting
**`doPolling()`**
- Executes one poll cycle asynchronously
- Prevents concurrent execution using `isPollInProgress` flag
- Calls `pollWorkflowData()` from `workflowData.js`
- Handles errors and implements exponential backoff on failures
- Self-schedules next poll using recursive `setTimeout`
- Validates workflow is still valid before scheduling next poll
**`stopPolling()`**
- Stops all polling operations immediately
- Clears all scheduled timeouts
- Resets all state flags (`isPolling`, `isPollInProgress`, `activeWorkflowId`)
- Resets failure count
**`pausePolling()` / `resumePolling()`**
- Temporarily pauses polling (e.g., during user interactions)
- Resumes polling after pause
#### Polling Flow
```javascript
startPolling(workflowId)
doPolling() [immediate first poll]
pollWorkflowData(workflowId) [async API call]
setTimeout(() => doPolling(), interval) [schedule next poll]
[recursive loop continues until stopped]
```
#### Error Handling
- **Rate limiting (429 errors)**: Increases backoff more aggressively, stops polling after 5 consecutive rate limit errors
- **Network errors**: Logged but don't immediately stop polling (allows retry)
- **Workflow validation**: Checks if workflow is still valid before each poll cycle
- **Poll failures**: Exponential backoff increases interval up to `maxInterval`
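The recursive-`setTimeout` loop and its backoff can be sketched as follows. This is an illustrative reconstruction, not the actual controller code: flag names (`isPolling`, `isPollInProgress`) and the interval constants follow the documentation above, while the wiring around `pollFn` is an assumption.

```javascript
// Pure helper: next poll delay after `failures` consecutive errors,
// using the documented base/max/multiplier values as defaults.
function nextInterval(failures, base = 5000, max = 10000, multiplier = 1.5) {
  return Math.min(base * Math.pow(multiplier, failures), max);
}

// Minimal sketch of the recursive-setTimeout polling loop.
function createPoller(pollFn) {
  let isPolling = false;
  let isPollInProgress = false;
  let failureCount = 0;
  let timeoutId = null;

  async function doPolling() {
    if (!isPolling || isPollInProgress) return; // concurrency guard
    isPollInProgress = true;
    try {
      await pollFn();
      failureCount = 0; // success resets the backoff
    } catch (err) {
      failureCount += 1; // each failure widens the next interval
    } finally {
      isPollInProgress = false;
    }
    // Self-schedule the next cycle; the chain ends when stop() flips the flag.
    if (isPolling) timeoutId = setTimeout(doPolling, nextInterval(failureCount));
  }

  return {
    start() { isPolling = true; doPolling(); }, // immediate first poll, no delay
    stop() { isPolling = false; clearTimeout(timeoutId); failureCount = 0; }
  };
}
```

With these defaults, one failure raises the delay to 7.5 s and two or more failures pin it at the 10 s cap.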
### 2. Data Fetching
**File**: `frontend_agents/public/js/modules/workflowData.js`
The `pollWorkflowData()` function orchestrates the data fetching process.
#### API Calls
The function makes two parallel API calls:
1. **`api.getWorkflow(workflowId)`** - Fetches workflow status and metadata
2. **`api.getWorkflowChatData(workflowId, afterTimestamp)`** - Fetches unified chat data (messages, logs, stats)
#### Incremental Polling
- **First poll**: `afterTimestamp = null` → Fetches ALL historical data
- **Subsequent polls**: `afterTimestamp = workflowState.lastRenderedTimestamp` → Fetches only new items since last render
- **Timestamp tracking**: Uses `createdAt` timestamp from each item to track what's been rendered
#### Data Processing
The `processUnifiedChatData()` function processes items in chronological order:
1. Routes each item based on `type` field:
   - `'message'` → `processUnifiedMessage()`
   - `'log'` → `processUnifiedLog()`
   - `'stat'` → `processUnifiedStat()`
2. Updates `lastRenderedTimestamp` after processing each item (ensures accurate incremental polling)
3. Processes items sequentially to maintain chronological order
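A simplified version of this loop, for illustration only: the handler names follow the documentation, while the exact `state` shape passed in is an assumption.

```javascript
// Route each unified item by type and advance the incremental-polling watermark.
function processUnifiedChatData(items, state, handlers) {
  for (const entry of items) { // sequential, preserving chronological order
    if (entry.type === 'message') handlers.processUnifiedMessage(entry.item);
    else if (entry.type === 'log') handlers.processUnifiedLog(entry.item);
    else if (entry.type === 'stat') handlers.processUnifiedStat(entry.item);
    // Update after each item, so the next poll can pass this value as
    // `afterTimestamp` and fetch only data created later.
    state.lastRenderedTimestamp = entry.createdAt;
  }
  return state;
}
```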
#### Workflow Status Updates
- Monitors workflow status changes
- Updates UI buttons and controls when status changes
- Handles special case: Ignores 'completed' status if workflow is in Round 2+ (prevents premature stopping)
#### Polling Continuation Logic
Polling continues based on workflow status:
- **'running'**: Continues polling
- **'completed'**: Continues polling temporarily to get final messages, then stops
- **'failed' / 'stopped'**: Stops polling immediately
- **Other statuses**: Stops polling
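The status-based decision above reduces to a small predicate. This is a hedged sketch: the "continues temporarily" behavior for `'completed'` is simplified here to a boolean flag rather than the real grace-period logic.

```javascript
// Decide whether the next poll cycle should be scheduled for this status.
function shouldContinuePolling(status, awaitingFinalMessages = false) {
  if (status === 'running') return true;
  if (status === 'completed') return awaitingFinalMessages; // brief grace period for final messages
  return false; // 'failed', 'stopped', and any other status stop immediately
}
```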
### 3. Log Routing
**File**: `frontend_agents/public/js/modules/workflowData.js` - `processUnifiedLog()`
Logs are routed to different rendering areas based on the presence of `operationId`:
#### Routing Logic
```javascript
if (log.operationId) {
// Logs WITH operationId → Dashboard
processDashboardLogs([frontendLog]);
} else {
// Logs WITHOUT operationId → Unified Content Area
WorkflowCoordination.addLogEntry(frontendLog.message, frontendLog.type, frontendLog);
}
```
#### Log Format Conversion
Backend `ChatLog` format is converted to frontend format:
```javascript
{
id: log.id,
message: log.message,
type: log.type || 'info',
timestamp: log.timestamp,
status: log.status || 'running',
progress: log.progress !== undefined && log.progress !== null ? log.progress : undefined,
performance: log.performance,
operationId: log.operationId || null,
parentId: log.parentId || null
}
```
#### Key Points
- **All logs are processed**: none are skipped as duplicates, since each log may carry a new progress update
- **Progress tracking**: Logs with `operationId` typically contain progress information
- **Hierarchical structure**: `parentId` field enables parent-child relationships between operations
### 4. Dashboard Log Processing
**File**: `frontend_agents/public/js/modules/workflowUiRendererDashboard.js` - `processDashboardLogs()`
This function processes logs with `operationId` and builds the hierarchical tree structure.
#### Processing Steps
1. **Group by operationId**
- Creates or updates operation groups in `dashboardLogTree.operations` Map
- Each operation stores logs in a Map keyed by `logId` (ensures uniqueness)
2. **Update operation metadata**
- Updates `parentId` if not set yet (from first log entry)
- Updates `latestProgress` when log contains progress value
- Updates `latestStatus` when log contains status value
3. **Generate unique log IDs**
- Uses provided `log.id` if available
- Otherwise generates: `log_${Date.now()}_${Math.random().toString(36).substring(2, 9)}`
- Ensures all progress updates are stored, even with same progress value
4. **Build root operations list**
- Filters operations without `parentId`
- Stores in `dashboardLogTree.rootOperations` array
5. **Trigger rendering**
- Calls `renderDashboard()` after processing all logs
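The grouping steps above can be sketched like this. The Map-of-Maps shape matches the documented tree structure, but the module-level state and rendering hook are simplified assumptions.

```javascript
// Simplified module state; the real tree also tracks expanded states and rounds.
const dashboardLogTree = { operations: new Map(), rootOperations: [] };

function processDashboardLogs(logs) {
  for (const log of logs) {
    if (!log.operationId) continue; // dashboard only handles operation logs
    let op = dashboardLogTree.operations.get(log.operationId);
    if (!op) {
      op = { logs: new Map(), parentId: null, expanded: true, latestProgress: null, latestStatus: null };
      dashboardLogTree.operations.set(log.operationId, op);
    }
    if (op.parentId === null && log.parentId) op.parentId = log.parentId; // first log sets the parent
    // Map keyed by log ID guarantees uniqueness; generate an ID if none provided.
    const logId = log.id || `log_${Date.now()}_${Math.random().toString(36).substring(2, 9)}`;
    op.logs.set(logId, log);
    if (log.progress !== undefined && log.progress !== null) op.latestProgress = log.progress;
    if (log.status) op.latestStatus = log.status;
  }
  // Root operations are those without a parent.
  dashboardLogTree.rootOperations = [...dashboardLogTree.operations.entries()]
    .filter(([, op]) => !op.parentId)
    .map(([id]) => id);
}
```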
#### Data Structure
```javascript
dashboardLogTree = {
operations: Map<operationId, {
logs: Map<logId, log>, // All logs for this operation
parentId: string | null, // Parent operation ID (if nested)
expanded: boolean, // UI expanded/collapsed state
latestProgress: number | null, // Most recent progress value
latestStatus: string | null // Most recent status value
}>,
rootOperations: string[], // Operation IDs without parent
logExpandedStates: Map<logId, boolean>, // Individual log expanded states
currentRound: number | null // Current workflow round
}
```
#### Important Behaviors
- **All logs stored**: Every log with same `operationId` is stored (represents progress updates)
- **Latest values tracked**: `latestProgress` and `latestStatus` always reflect most recent state
- **Parent-child relationships**: Operations can nest via `parentId` field
### 5. Sorting
**File**: `frontend_agents/public/js/modules/workflowUiRendererDashboard.js`
Multiple sorting mechanisms ensure consistent display order:
#### Operation-Level Log Sorting
**Location**: `renderOperationNode()` function, lines 169-173
Logs within an operation are sorted by timestamp in ascending order:
```javascript
const logsArray = Array.from(operation.logs.values()).sort((a, b) => {
const tsA = a.timestamp || 0;
const tsB = b.timestamp || 0;
return tsA - tsB; // Ascending order (oldest first)
});
```
**Purpose**: Ensures logs are displayed in chronological order within each operation.
#### Child Operations Sorting
**Location**: `getChildOperations()` function, line 453
Child operations are sorted alphabetically by `operationId`:
```javascript
return Array.from(dashboardLogTree.operations.entries())
.filter(([opId, op]) => op.parentId === parentId)
.map(([opId]) => opId)
.sort(); // Alphabetical sort for consistent ordering
```
**Purpose**: Provides consistent, predictable ordering of sibling operations.
#### Timeline Sorting (Unified Content)
**Location**: `workflowUiRenderer.js` - `renderUnifiedContent()` function
Logs without `operationId` are combined with messages and sorted by timestamp:
```javascript
timeline.sort((a, b) => a.timestamp - b.timestamp);
```
**Purpose**: Creates a unified chronological timeline of all non-dashboard content.
#### Sorting Summary
| Context | Sort Key | Order | Purpose |
|---------|----------|-------|---------|
| Logs within operation | `timestamp` | Ascending | Chronological display |
| Child operations | `operationId` | Alphabetical | Consistent ordering |
| Unified timeline | `timestamp` | Ascending | Chronological timeline |
### 6. Rendering
**File**: `frontend_agents/public/js/modules/workflowUiRendererDashboard.js` - `renderDashboard()`
The rendering system creates a hierarchical tree structure with collapsible nodes and progress indicators.
#### Hierarchical Structure
- **Root operations**: Operations without `parentId` are rendered first
- **Child operations**: Operations with `parentId` matching a parent's `operationId` are nested
- **Single line per operation**: Each operation shows ONE line that updates with latest status/progress
- **All logs represented**: All logs with same `operationId` are represented by this single updating line
#### Rendering Process
**Step 1: `renderDashboard()`**
- Builds HTML from `dashboardLogTree` structure
- Handles empty state (no operations)
- Sets up event handlers for collapse/expand functionality
**Step 2: `renderOperationNode(operationId, depth)`** (Recursive)
- Renders a single operation node
- Calculates indentation based on depth (8px per level)
- Determines if operation has child operations
- Gets latest log entry for operation name and type
- Calculates progress percentage (forces 100% when status is 'completed')
- Builds HTML for:
- Expand/collapse button (if has children)
- Operation icon (based on log type)
- Operation name (from latest log message)
- Status and progress percentage
- Progress bar (if progress available)
- Recursively renders child operations if expanded
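A stripped-down version of this recursion, for illustration: it keeps the documented behaviors (timestamp sort, forced 100% on completion, alphabetical children, 8px-per-level indent) but the real function builds much richer HTML with icons, badges, and progress bars.

```javascript
// Recursively render one operation node and its expanded children.
function renderOperationNode(tree, operationId, depth = 0) {
  const op = tree.operations.get(operationId);
  // Logs sorted ascending by timestamp; the latest one names the operation.
  const logs = [...op.logs.values()].sort((a, b) => (a.timestamp || 0) - (b.timestamp || 0));
  const latest = logs[logs.length - 1];
  // Force 100% when the operation is completed, regardless of last progress value.
  const progress = op.latestStatus === 'completed' ? 100 : Math.round((op.latestProgress || 0) * 100);
  // Children sorted alphabetically by operationId for stable ordering.
  const children = [...tree.operations.entries()]
    .filter(([, o]) => o.parentId === operationId)
    .map(([id]) => id)
    .sort();
  let html = `<div class="op" style="padding-left:${depth * 8}px">${latest.message} (${progress}%)</div>`;
  if (op.expanded) {
    for (const childId of children) html += renderOperationNode(tree, childId, depth + 1);
  }
  return html;
}
```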
#### Visual Elements
**Operation Header**
- Expand/collapse button (chevron icon)
- Operation icon (info/success/error/warning)
- Operation name (from latest log message)
- Status badge (running/completed/failed/etc.)
- Progress percentage (if available)
**Progress Bar**
- Visual progress indicator
- Width based on progress percentage (0-100%)
- "completed" class when progress >= 100%
- Hidden if no progress value
**Indentation**
- Root level (depth 0): No indentation
- Child levels: Indented via parent container padding (8px per level)
- Creates visual hierarchy
#### State Management
**Expanded/Collapsed State**
- Stored in `operation.expanded` boolean
- Toggled via `toggleOperationExpanded(operationId)`
- Persists during re-renders
- Controls visibility of child operations container
**Event Handlers**
- `setupCollapseExpandHandlers()`: Sets up click handlers for expand buttons
- `setupLogCollapseExpandHandlers()`: Sets up handlers for log entry expansion
- Click handlers toggle expanded state and re-render dashboard
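The toggle itself is just a state flip that survives re-renders because it lives in the tree, not the DOM. A minimal sketch, assuming the documented function name; the DOM event wiring is omitted here.

```javascript
// Flip an operation's expanded flag; returns the new state, or null if unknown.
function toggleOperationExpanded(tree, operationId) {
  const op = tree.operations.get(operationId);
  if (!op) return null;
  op.expanded = !op.expanded; // persists across re-renders since it lives in the tree
  return op.expanded;
}
```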
#### Rendering Flow
```
renderDashboard()
[For each root operation]
renderOperationNode(operationId, 0)
[Build operation header HTML]
[If has children and expanded]
[For each child operation]
renderOperationNode(childOperationId, depth)
[Recursive rendering continues...]
[Set innerHTML of dashboard container]
[Setup event handlers]
```
#### Key Rendering Features
1. **Progress Updates**: Operation line updates in-place as new logs arrive
2. **Status Changes**: Status badge updates when operation status changes
3. **Collapsible Tree**: Users can expand/collapse operation groups
4. **Visual Hierarchy**: Indentation shows parent-child relationships
5. **Latest State**: Always shows most recent log message, progress, and status
## Data Structures
### Dashboard Log Tree
```javascript
{
operations: Map<operationId, {
logs: Map<logId, log>, // All logs for this operation
parentId: string | null, // Parent operation ID
expanded: boolean, // UI expanded state
latestProgress: number | null, // Most recent progress (0-1)
latestStatus: string | null // Most recent status
}>,
rootOperations: string[], // Operation IDs without parent
logExpandedStates: Map<logId, boolean>, // Individual log expanded states
currentRound: number | null // Current workflow round
}
```
### Log Entry Format
```javascript
{
id: string, // Unique log ID
message: string, // Log message text
type: 'info' | 'success' | 'error' | 'warning',
timestamp: number, // Unix timestamp (seconds)
status: string, // Operation status
progress: number | null, // Progress value (0-1) or null
operationId: string | null, // Operation ID (null = unified content)
parentId: string | null // Parent operation ID (for nesting)
}
```
### Unified Chat Data Item
```javascript
{
type: 'message' | 'log' | 'stat', // Item type
item: { /* message/log/stat data */ },
createdAt: number // Timestamp for sorting
}
```
## Key Features
### 1. Incremental Polling
- Uses `lastRenderedTimestamp` to fetch only new items
- First poll loads all historical data (`afterTimestamp = null`)
- Subsequent polls fetch incrementally (`afterTimestamp = lastRenderedTimestamp`)
- Reduces API load and improves performance
### 2. Hierarchical Display
- Operations can have parent-child relationships via `parentId`
- Visual indentation shows hierarchy
- Collapsible tree structure for better UX
- Supports unlimited nesting depth
### 3. Progress Tracking
- Shows progress bars for operations with progress values
- Updates in real-time as new logs arrive
- Forces 100% progress when status is 'completed'
- Displays status badges (running/completed/failed/etc.)
### 4. Collapsible Tree
- Users can expand/collapse operation groups
- Expand/collapse state persists during re-renders
- Click handlers on operation headers and expand buttons
- Smooth visual transitions
### 5. Round Detection
- Tracks current workflow round in `dashboardLogTree.currentRound`
- Clears dashboard when round changes (via `updateProgressFromMessage()`)
- Prevents mixing data from different workflow rounds
### 6. Duplicate Prevention
- Uses Map with `logId` keys to prevent duplicate entries
- Same log ID updates in place rather than creating duplicates
- Ensures unique log entries even with same progress value
## Error Handling
### Rate Limiting (429 Errors)
- Detected in `pollWorkflowData()` and `doPolling()`
- Triggers exponential backoff with increased multiplier
- Stops polling after 5 consecutive rate limit errors
- Prevents API abuse
### Network Errors
- Logged but don't immediately stop polling
- Allows retry on transient network issues
- Controller handles backoff automatically
- Polling continues for recoverable errors
### Rendering Errors
- Don't stop polling (UI issue, not data issue)
- Logged for debugging
- Polling continues to get workflow status updates
- UI can recover on next successful render
### Workflow Validation
- `isWorkflowValid()` checks before each poll cycle
- Validates workflow state exists and matches active workflow
- Checks if polling is still enabled (`pollActive` flag)
- Stops polling if workflow is invalid
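The validity check amounts to a small predicate over the polling state. This is an assumed shape: the flag and field names follow the documentation, the exact state object is illustrative.

```javascript
// True only when a workflow state exists, matches the active workflow,
// and polling has not been disabled.
function isWorkflowValid(state) {
  return Boolean(
    state &&
    state.workflowId &&
    state.workflowId === state.activeWorkflowId &&
    state.pollActive
  );
}
```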
## Performance Considerations
### Polling Intervals
- Base interval: 5 seconds (balanced between responsiveness and server load)
- Maximum interval: 10 seconds (prevents excessive backoff)
- Exponential backoff: Prevents overwhelming server during errors
### Data Processing
- Processes items sequentially to maintain chronological order
- Uses Maps for O(1) lookups when grouping operations
- Incremental polling reduces data transfer
- Timestamp-based filtering at API level
### Rendering Optimization
- Full re-render on each update (simplifies state management)
- Event handlers re-attached after each render
- HTML generation is efficient (string concatenation)
- Minimal DOM manipulation (innerHTML replacement)
## Usage Examples
### Starting Polling
```javascript
import pollingController from './workflowPollingController.js';
// Start polling for a workflow
pollingController.startPolling('workflow-123');
```
### Stopping Polling
```javascript
// Stop polling
pollingController.stopPolling();
```
### Processing Dashboard Logs
```javascript
import { processDashboardLogs } from './workflowUiRendererDashboard.js';
// Process logs with operationId
const logs = [
{
id: 'log-1',
message: 'Processing file...',
type: 'info',
timestamp: 1234567890,
status: 'running',
progress: 0.5,
operationId: 'op-123',
parentId: null
}
];
processDashboardLogs(logs);
```
### Clearing Dashboard
```javascript
import { clearDashboard } from './workflowUiRendererDashboard.js';
// Clear dashboard (e.g., on workflow reset)
clearDashboard(true); // true = reset round tracking
```
## Related Documentation
- `FRONTEND_ARCHITECTURE.md` - Overall frontend architecture
- `workflowCoordination.js` - State management coordination
- `workflowUiRenderer.js` - Unified content rendering
## Conclusion
The dashboard log polling and rendering system provides a robust, hierarchical display of workflow operations with real-time updates. The system efficiently handles incremental polling, sorts data chronologically, and renders a collapsible tree structure that scales to complex workflows with multiple nested operations.

@ -96,7 +96,11 @@ api.interceptors.response.use(
error.config?.url?.includes('/api/local/login') ||
error.config?.url?.includes('/api/msft/login');
if (!isLoginEndpoint) {
// Don't redirect if we're already on the login page (prevents redirect loops)
const isOnLoginPage = window.location.pathname === '/login' ||
window.location.pathname.startsWith('/login');
if (!isLoginEndpoint && !isOnLoginPage) {
// Clear local auth data (httpOnly cookies are cleared by backend)
sessionStorage.removeItem('auth_authority');
clearUserDataCache();
@ -104,6 +108,13 @@ api.interceptors.response.use(
window.location.href = '/login';
}
}
// Handle rate limiting (429) - don't throw, just log and return error
if (error.response?.status === 429) {
console.warn('Rate limit exceeded (429). Please wait before making more requests.');
// Don't cause cascading errors by throwing here
}
return Promise.reject(error);
}
);

src/api/attributesApi.ts Normal file
@ -0,0 +1,132 @@
import { ApiRequestOptions } from '../hooks/useApi';
// ============================================================================
// TYPES & INTERFACES
// ============================================================================
export interface AttributeDefinition {
name: string;
label: string;
type: 'string' | 'number' | 'date' | 'boolean' | 'enum' | 'text' | 'email' | 'checkbox' | 'select' | 'multiselect' | 'textarea';
sortable?: boolean;
filterable?: boolean;
searchable?: boolean;
width?: number;
minWidth?: number;
maxWidth?: number;
filterOptions?: string[];
description?: string;
required?: boolean;
default?: any;
options?: Array<{ value: string | number; label: string | { [key: string]: string } }> | string;
validation?: any;
ui?: any;
readonly?: boolean;
editable?: boolean;
visible?: boolean;
order?: number;
placeholder?: string;
}
// Type for the request function passed to API functions
export type ApiRequestFunction = (options: ApiRequestOptions<any>) => Promise<any>;
// ============================================================================
// API REQUEST FUNCTIONS
// ============================================================================
/**
* Generic function to fetch attributes for any entity type
* Endpoint: GET /api/attributes/{entityType}
*/
export async function fetchAttributes(
request: ApiRequestFunction,
entityType: string
): Promise<AttributeDefinition[]> {
const data = await request({
url: `/api/attributes/${entityType}`,
method: 'get'
});
// Extract attributes from response - check if response.data.attributes exists, otherwise check if response.data is an array
let attrs: AttributeDefinition[] = [];
if (data?.attributes && Array.isArray(data.attributes)) {
attrs = data.attributes;
} else if (Array.isArray(data)) {
attrs = data;
} else if (data && typeof data === 'object') {
// Try to find any array property in the response
const keys = Object.keys(data);
for (const key of keys) {
if (Array.isArray(data[key])) {
attrs = data[key];
break;
}
}
}
return attrs;
}
/**
* Fetch connection attributes from backend
* Endpoint: GET /api/attributes/UserConnection
*/
export async function fetchConnectionAttributes(request: ApiRequestFunction): Promise<AttributeDefinition[]> {
return fetchAttributes(request, 'UserConnection');
}
/**
* Fetch file attributes from backend
* Endpoint: GET /api/attributes/FileItem
*/
export async function fetchFileAttributes(request: ApiRequestFunction): Promise<AttributeDefinition[]> {
const data = await request({
url: '/api/attributes/FileItem',
method: 'get'
});
// Handle different response formats
if (Array.isArray(data)) {
return data;
}
if (data && typeof data === 'object' && 'attributes' in data && Array.isArray(data.attributes)) {
return data.attributes;
}
// Try to find any array property in the response
if (data && typeof data === 'object') {
const keys = Object.keys(data);
for (const key of keys) {
if (Array.isArray((data as any)[key])) {
return (data as any)[key];
}
}
}
return [];
}
/**
* Fetch prompt attributes from backend
* Endpoint: GET /api/attributes/Prompt
*/
export async function fetchPromptAttributes(request: ApiRequestFunction): Promise<AttributeDefinition[]> {
return fetchAttributes(request, 'Prompt');
}
/**
* Fetch user attributes from backend
* Endpoint: GET /api/attributes/User
*/
export async function fetchUserAttributes(request: ApiRequestFunction): Promise<AttributeDefinition[]> {
return fetchAttributes(request, 'User');
}
/**
* Fetch workflow attributes from backend
* Endpoint: GET /api/attributes/ChatWorkflow
*/
export async function fetchWorkflowAttributes(request: ApiRequestFunction): Promise<AttributeDefinition[]> {
return fetchAttributes(request, 'ChatWorkflow');
}

@ -171,10 +171,19 @@ export async function registerApi(registerData: RegisterData): Promise<RegisterR
headers
});
const userData: any = response.data;
return {
success: true,
message: 'Registration successful',
user: response.data
user: userData && typeof userData === 'object' && 'id' in userData ? {
id: String(userData.id || ''),
username: String(userData.username || ''),
email: String(userData.email || ''),
fullName: String(userData.fullName || ''),
language: String(userData.language || 'en'),
enabled: Boolean(userData.enabled !== false),
privilege: String(userData.privilege || 'user')
} : undefined
};
}
@ -186,7 +195,7 @@ export async function registerWithMsalApi(
request: ApiRequestFunction,
userData: MsalRegisterData
): Promise<RegisterResponse> {
const response = await request<RegisterResponse>({
const response = await request({
url: '/api/msft/register',
method: 'post',
data: userData,
@ -197,10 +206,19 @@ export async function registerWithMsalApi(
}
});
const responseData: any = response;
return {
success: true,
message: 'Registration successful',
user: response
user: responseData && typeof responseData === 'object' && 'id' in responseData ? {
id: String(responseData.id || ''),
username: String(responseData.username || ''),
email: String(responseData.email || ''),
fullName: String(responseData.fullName || ''),
language: String(responseData.language || 'en'),
enabled: Boolean((responseData as any).enabled !== false),
privilege: String((responseData as any).privilege || 'user')
} : undefined
};
}

@ -78,7 +78,7 @@ export type ApiRequestFunction = (options: ApiRequestOptions<any>) => Promise<an
* Fetch connection attributes from backend
* Endpoint: GET /api/attributes/UserConnection
*/
export async function fetchConnectionAttributes(request: ApiRequestFunction): Promise<AttributeDefinition[]> {
export async function fetchConnectionAttributes(_request: ApiRequestFunction): Promise<AttributeDefinition[]> {
// Note: This uses api.get directly due to response format handling
// For now, we'll use api.get directly in the hook as well
throw new Error('fetchConnectionAttributes should use api instance directly for response format handling');
@ -109,7 +109,7 @@ export async function fetchConnections(
}
}
const data = await request<PaginatedResponse<Connection> | Connection[]>({
const data = await request({
url: '/api/connections/',
method: 'get',
params: requestParams
@ -126,7 +126,7 @@ export async function createConnection(
request: ApiRequestFunction,
connectionData: CreateConnectionData
): Promise<Connection> {
return await request<Connection>({
return await request({
url: '/api/connections/',
method: 'post',
data: connectionData
@ -141,7 +141,7 @@ export async function connectService(
request: ApiRequestFunction,
connectionId: string
): Promise<ConnectResponse> {
return await request<ConnectResponse>({
return await request({
url: `/api/connections/${connectionId}/connect`,
method: 'post'
});
@ -155,7 +155,7 @@ export async function disconnectService(
request: ApiRequestFunction,
connectionId: string
): Promise<{ message: string }> {
return await request<{ message: string }>({
return await request({
url: `/api/connections/${connectionId}/disconnect`,
method: 'post'
});
@ -169,7 +169,7 @@ export async function deleteConnection(
request: ApiRequestFunction,
connectionId: string
): Promise<{ message: string }> {
return await request<{ message: string }>({
return await request({
url: `/api/connections/${connectionId}`,
method: 'delete'
});
@ -184,7 +184,7 @@ export async function updateConnection(
connectionId: string,
updateData: Partial<Connection>
): Promise<Connection> {
return await request<Connection>({
return await request({
url: `/api/connections/${connectionId}`,
method: 'put',
data: updateData
@ -199,7 +199,7 @@ export async function refreshMicrosoftToken(
request: ApiRequestFunction,
connectionId: string
): Promise<Connection> {
return await request<Connection>({
return await request({
url: `/api/connections/${connectionId}/refresh-microsoft-token`,
method: 'post'
});
@ -213,7 +213,7 @@ export async function refreshGoogleToken(
request: ApiRequestFunction,
connectionId: string
): Promise<Connection> {
return await request<Connection>({
return await request({
url: `/api/connections/${connectionId}/refresh-google-token`,
method: 'post'
});

@ -58,7 +58,7 @@ export type ApiRequestFunction = (options: ApiRequestOptions<any>) => Promise<an
* Endpoint: GET /api/attributes/FileItem
*/
export async function fetchFileAttributes(request: ApiRequestFunction): Promise<AttributeDefinition[]> {
const data = await request<AttributeDefinition[] | { attributes: AttributeDefinition[] }>({
const data = await request({
url: '/api/attributes/FileItem',
method: 'get'
});
@ -109,7 +109,7 @@ export async function fetchFiles(
}
}
const data = await request<PaginatedResponse<FileInfo> | FileInfo[]>({
const data = await request({
url: '/api/files/list',
method: 'get',
params: requestParams
@ -127,7 +127,7 @@ export async function fetchFileById(
fileId: string
): Promise<FileInfo | null> {
try {
const data = await request<FileInfo>({
const data = await request({
url: `/api/files/${fileId}`,
method: 'get'
});
@ -147,7 +147,7 @@ export async function updateFile(
fileId: string,
fileData: Partial<FileInfo>
): Promise<FileInfo> {
return await request<FileInfo>({
return await request({
url: `/api/files/${fileId}`,
method: 'put',
data: fileData

@ -38,12 +38,34 @@ export async function fetchPermissions(
params.item = item;
}
const data = await request<UserPermissions>({
console.log('📡 fetchPermissions: Requesting permissions:', {
context,
item,
params,
url: '/api/rbac/permissions'
});
const data = await request({
url: '/api/rbac/permissions',
method: 'get',
params
});
console.log('📥 fetchPermissions: Received permissions response:', {
context,
item,
response: data,
view: data?.view,
read: data?.read,
create: data?.create,
update: data?.update,
delete: data?.delete,
type: typeof data,
isArray: Array.isArray(data),
keys: data ? Object.keys(data) : [],
fullResponse: JSON.stringify(data, null, 2)
});
return data;
}

@ -84,7 +84,7 @@ export type ApiRequestFunction = (options: ApiRequestOptions<any>) => Promise<an
* Fetch prompt attributes from backend
* Endpoint: GET /api/attributes/Prompt
*/
export async function fetchPromptAttributes(request: ApiRequestFunction): Promise<AttributeDefinition[]> {
export async function fetchPromptAttributes(_request: ApiRequestFunction): Promise<AttributeDefinition[]> {
// Note: This uses api.get directly due to response format handling
// For now, we'll use api.get directly in the hook as well
throw new Error('fetchPromptAttributes should use api instance directly for response format handling');
@ -115,7 +115,7 @@ export async function fetchPrompts(
}
}
const data = await request<PaginatedResponse<Prompt> | Prompt[]>({
const data = await request({
url: '/api/prompts',
method: 'get',
params: requestParams
@ -133,7 +133,7 @@ export async function fetchPromptById(
promptId: string
): Promise<Prompt | null> {
try {
const data = await request<Prompt>({
const data = await request({
url: `/api/prompts/${promptId}`,
method: 'get'
});
@ -152,7 +152,7 @@ export async function createPrompt(
request: ApiRequestFunction,
promptData: CreatePromptData
): Promise<Prompt> {
return await request<Prompt>({
return await request({
url: '/api/prompts',
method: 'post',
data: promptData
@ -168,7 +168,7 @@ export async function updatePrompt(
promptId: string,
promptData: UpdatePromptData
): Promise<Prompt> {
return await request<Prompt>({
return await request({
url: `/api/prompts/${promptId}`,
method: 'put',
data: promptData

@ -11,7 +11,8 @@ export interface User {
fullName: string;
language: string;
enabled: boolean;
privilege: string;
privilege?: string; // Deprecated - use roleLabels instead
roleLabels?: string[]; // Array of role labels from backend (e.g., ["user"])
authenticationAuthority: string;
mandateId: string;
[key: string]: any; // Allow additional properties
@ -35,6 +36,8 @@ export interface AttributeDefinition {
minWidth?: number;
maxWidth?: number;
filterOptions?: string[];
readonly?: boolean;
editable?: boolean;
}
export interface PaginationParams {
@ -78,10 +81,23 @@ export async function fetchCurrentUser(
endpoint = '/api/google/me';
}
return await request<User>({
console.log('📡 fetchCurrentUser: Requesting user data from:', endpoint);
const response = await request({
url: endpoint,
method: 'get'
});
console.log('📥 fetchCurrentUser: Received response:', {
endpoint,
hasData: !!response,
username: response?.username,
roleLabels: response?.roleLabels,
privilege: response?.privilege,
allKeys: response ? Object.keys(response) : [],
fullResponse: response
});
return response;
}
/**
@@ -108,7 +124,7 @@ export async function logoutUser(
* Fetch user attributes from backend
* Endpoint: GET /api/attributes/User
*/
export async function fetchUserAttributes(request: ApiRequestFunction): Promise<AttributeDefinition[]> {
export async function fetchUserAttributes(_request: ApiRequestFunction): Promise<AttributeDefinition[]> {
// Note: This uses api.get directly in the hook due to response format handling
// Keeping the function signature here for consistency, but implementation may need api instance
throw new Error('fetchUserAttributes should use api instance directly for response format handling');
@@ -139,7 +155,7 @@ export async function fetchUsers(
}
}
const data = await request<PaginatedResponse<User> | User[]>({
const data = await request({
url: '/api/users/',
method: 'get',
params: requestParams
@@ -157,7 +173,7 @@ export async function fetchUserById(
userId: string
): Promise<User | null> {
try {
const data = await request<User>({
const data = await request({
url: `/api/users/${userId}`,
method: 'get'
});
@@ -176,7 +192,7 @@ export async function createUser(
request: ApiRequestFunction,
userData: Partial<User>
): Promise<User> {
return await request<User>({
return await request({
url: '/api/users',
method: 'post',
data: userData
@@ -192,7 +208,7 @@ export async function updateUser(
userId: string,
userData: UserUpdateData
): Promise<User> {
return await request<User>({
return await request({
url: `/api/users/${userId}`,
method: 'put',
data: userData

View file

@@ -72,11 +72,58 @@ export type ApiRequestFunction = (options: ApiRequestOptions<any>) => Promise<an
* Endpoint: GET /api/workflows/
*/
export async function fetchWorkflows(request: ApiRequestFunction): Promise<Workflow[]> {
const data = await request<Workflow[]>({
url: '/api/workflows/',
method: 'get'
});
return Array.isArray(data) ? data : [];
console.log('📤 fetchWorkflows: Making API request to /api/workflows/');
try {
const data = await request({
url: '/api/workflows/',
method: 'get'
});
console.log('📥 fetchWorkflows: API response:', data);
// Handle different response formats
let workflows: Workflow[] = [];
if (Array.isArray(data)) {
// Direct array response
workflows = data;
} else if (data && typeof data === 'object') {
// Check for common wrapper properties
if (Array.isArray(data.workflows)) {
workflows = data.workflows;
} else if (Array.isArray(data.data)) {
workflows = data.data;
} else if (Array.isArray(data.items)) {
workflows = data.items;
} else if (Array.isArray(data.results)) {
workflows = data.results;
} else {
// Try to find any array property
const keys = Object.keys(data);
for (const key of keys) {
if (Array.isArray(data[key])) {
workflows = data[key];
console.log(` fetchWorkflows: Found workflows array in property '${key}'`);
break;
}
}
}
}
// Validate that we have workflow objects with id property
const validWorkflows = workflows.filter((w: any) => w && typeof w === 'object' && w.id);
if (validWorkflows.length !== workflows.length) {
console.warn(`⚠️ fetchWorkflows: Filtered out ${workflows.length - validWorkflows.length} invalid workflows`);
}
console.log(`✅ fetchWorkflows: Returning ${validWorkflows.length} valid workflows`);
return validWorkflows;
} catch (error) {
console.error('❌ fetchWorkflows: Error fetching workflows:', error);
throw error;
}
}
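The response handling added to `fetchWorkflows` above boils down to an unwrapping step: accept a bare array, then known wrapper keys, then any array-valued property. A standalone sketch of that logic (the helper itself is illustrative, not part of the commit):

```typescript
// Sketch of the unwrapping logic above: try a bare array first, then the
// wrapper keys the diff checks explicitly, then any array-valued property.
function unwrapWorkflowArray(data: unknown): unknown[] {
  if (Array.isArray(data)) return data;
  if (data && typeof data === "object") {
    const obj = data as Record<string, unknown>;
    for (const key of ["workflows", "data", "items", "results", ...Object.keys(obj)]) {
      if (Array.isArray(obj[key])) return obj[key] as unknown[];
    }
  }
  return [];
}
```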
/**
@@ -87,7 +134,7 @@ export async function fetchWorkflow(
request: ApiRequestFunction,
workflowId: string
): Promise<Workflow & { messages?: WorkflowMessage[]; logs?: WorkflowLog[] }> {
return await request<any>({
return await request({
url: `/api/workflows/${workflowId}`,
method: 'get'
});
@@ -101,7 +148,7 @@ export async function fetchWorkflowStatus(
request: ApiRequestFunction,
workflowId: string
): Promise<Workflow | { status: string } | null> {
const data = await request<any>({
const data = await request({
url: `/api/workflows/${workflowId}/status`,
method: 'get'
});
@@ -127,7 +174,7 @@ export async function fetchWorkflowMessages(
messageId?: string
): Promise<WorkflowMessage[]> {
const params = messageId ? { messageId } : undefined;
const data = await request<any>({
const data = await request({
url: `/api/workflows/${workflowId}/messages`,
method: 'get',
params
@@ -160,7 +207,7 @@ export async function fetchWorkflowLogs(
logId?: string
): Promise<WorkflowLog[]> {
const params = logId ? { logId } : undefined;
const data = await request<any>({
const data = await request({
url: `/api/workflows/${workflowId}/logs`,
method: 'get',
params
@@ -201,11 +248,62 @@ export async function fetchChatData(
console.log('📤 fetchChatData request:', requestConfig);
const data = await request<ChatDataResponse>(requestConfig);
const data = await request(requestConfig);
console.log('📥 fetchChatData response:', data);
// Ensure all arrays exist
// Handle unified items format: { items: [{ type: 'message'|'log'|'stat', item: {...}, createdAt: ... }] }
if (data.items && Array.isArray(data.items)) {
const messages: WorkflowMessage[] = [];
const logs: WorkflowLog[] = [];
const stats: WorkflowStats[] = [];
const documents: WorkflowDocument[] = [];
data.items.forEach((item: any) => {
if (item.type === 'message') {
// Handle both formats: item.item or direct item data
const messageData = item.item || item;
if (messageData && (messageData.id || messageData.message)) {
messages.push(messageData);
} else {
console.warn('⚠️ Invalid message item:', item);
}
} else if (item.type === 'log') {
const logData = item.item || item;
if (logData) {
logs.push(logData);
}
} else if (item.type === 'stat') {
const statData = item.item || item;
if (statData) {
stats.push(statData);
}
}
// Documents might be in items or separate
if (item.type === 'document') {
const docData = item.item || item;
if (docData) {
documents.push(docData);
}
}
});
console.log('📦 Extracted from items:', {
messages: messages.length,
logs: logs.length,
stats: stats.length,
documents: documents.length
});
return {
messages,
logs,
stats,
documents: documents.length > 0 ? documents : (Array.isArray(data.documents) ? data.documents : [])
};
}
// Fallback to direct format: { messages: [], logs: [], stats: [] }
return {
messages: Array.isArray(data.messages) ? data.messages : [],
logs: Array.isArray(data.logs) ? data.logs : [],
@@ -261,7 +359,7 @@ export async function startWorkflowApi(
console.log(' Request Body:', JSON.stringify(requestBody, null, 2));
console.log(' Full Request Config:', JSON.stringify(requestConfig, null, 2));
const response = await request<StartWorkflowResponse>(requestConfig);
const response = await request(requestConfig);
console.log('📥 startWorkflow response:', response);
@@ -291,7 +389,7 @@ export async function updateWorkflowApi(
workflowId: string,
updateData: Partial<{ name: string; description?: string; tags?: string[] }>
): Promise<Workflow> {
return await request<Workflow>({
return await request({
url: `/api/workflows/${workflowId}`,
method: 'put',
data: updateData
@@ -396,7 +494,7 @@ export async function fetchAttributes(
request: ApiRequestFunction,
entityType: string = 'ChatWorkflow'
): Promise<AttributeDefinition[]> {
const data = await request<any>({
const data = await request({
url: `/api/attributes/${entityType}`,
method: 'get'
});

View file

@@ -2,7 +2,7 @@ import React, { useState } from 'react';
import { MdModeEdit } from 'react-icons/md';
import { useLanguage } from '../../../../providers/language/LanguageContext';
import { Popup } from '../../../UiComponents/Popup';
import { FormGeneratorForm } from '../../FormGeneratorForm';
import { FormGeneratorForm, AttributeDefinition } from '../../FormGeneratorForm';
import styles from '../ActionButton.module.css';
export interface EditActionButtonProps<T = any> {
@@ -154,16 +154,18 @@ export function EditActionButton<T = any>({
// Get the item ID from the row
const itemId = (editData as any)[idField];
// Get edit fields configuration
const fields = getEditFields();
// Get edit fields configuration from attributes
const attributes = getAttributes();
const fields = attributes || [];
// Extract the fields to update from the edit data
const updateData: any = {};
fields.forEach(field => {
fields.forEach((field: AttributeDefinition) => {
if (field.editable !== false) {
const value = (updatedData as any)[field.key];
const fieldName = field.name;
const value = (updatedData as any)[fieldName];
if (value !== undefined) {
updateData[field.key] = value;
updateData[fieldName] = value;
}
}
});

View file

@@ -30,9 +30,9 @@ export function PlayActionButton<T = any>({
loading = false,
className = '',
title,
hookData,
hookData: _hookData,
idField = 'id',
nameField = 'name',
nameField: _nameField = 'name',
contentField = 'content',
navigateTo = 'start/dashboard',
mode = 'prompt'
@@ -55,13 +55,26 @@
}
if (mode === 'workflow') {
// Workflow mode: select workflow and navigate
// Workflow mode: reset workflow state and select workflow
const workflowId = (row as any)[idField];
if (!workflowId) {
console.error('Workflow ID not found in row');
return;
}
// Dispatch event to reset workflow state before selecting new one
// This ensures the dashboard resets and loads the selected workflow
window.dispatchEvent(new CustomEvent('workflowCleared', {
detail: { workflowId: null }
}));
// Select the workflow in context (this will trigger sync in dashboard)
selectWorkflow(workflowId);
// Also dispatch workflowSelected event for any other listeners
window.dispatchEvent(new CustomEvent('workflowSelected', {
detail: { workflowId }
}));
} else {
// Prompt mode: set input value in dashboard
const content = (row as any)[contentField];

View file

@@ -4,12 +4,14 @@ import styles from './FormGeneratorControls.module.css';
import { Button } from '../../UiComponents/Button';
import { IoIosRefresh } from "react-icons/io";
import { FaTrash } from "react-icons/fa";
import { isCheckboxType } from '../../../utils/attributeTypeMapper';
import type { AttributeType } from '../../../utils/attributeTypeMapper';
// Generic field/column config interface
export interface FilterableField {
key: string;
label: string;
type?: 'string' | 'number' | 'date' | 'boolean' | 'enum' | 'readonly';
type?: AttributeType;
filterable?: boolean;
filterOptions?: string[];
}
@@ -62,7 +64,7 @@ export function FormGeneratorControls({
filterFocused,
onFilterFocus,
selectedCount,
displayData,
displayData: _displayData,
onDeleteSingle,
onDeleteMultiple,
onRefresh,
@@ -215,7 +217,7 @@
<div className={styles.filtersContainer}>
{filterableFields.map(field => (
<div key={field.key} className={styles.filterGroup}>
{field.type === 'boolean' ? (
{field.type && isCheckboxType(field.type) ? (
<div className={styles.customSelectContainer}>
<select
value={filters[field.key] || ''}

View file

@@ -2,13 +2,23 @@ import React, { useState, useEffect, useCallback } from 'react';
import { useLanguage } from '../../../providers/language/LanguageContext';
import api from '../../../api';
import styles from './FormGeneratorForm.module.css';
import {
attributeTypeToInputType,
isTextareaType,
isSelectType,
isMultiselectType,
isCheckboxType,
isFileType,
isNumberType,
isDateTimeType,
getDefaultValueForType
} from '../../../utils/attributeTypeMapper';
import type { AttributeType } from '../../../utils/attributeTypeMapper';
// Attribute definition interface (matches backend structure)
// Attribute definition interface (matches backend structure)
export interface AttributeDefinition {
name: string;
type: 'text' | 'email' | 'date' | 'checkbox' | 'select' | 'multiselect' | 'number' | 'textarea' |
'timestamp' | 'time' | 'url' | 'password' | 'file' | 'integer' | 'float' | 'string' |
'boolean' | 'enum' | 'readonly';
type: AttributeType;
label: string;
description?: string;
required?: boolean;
@@ -177,12 +187,8 @@ export function FormGeneratorForm<T extends Record<string, any>>({
filteredAttrs.forEach(attr => {
if (attr.default !== undefined) {
initialData[attr.name] = attr.default;
} else if (attr.type === 'checkbox' || attr.type === 'boolean') {
initialData[attr.name] = false;
} else if (attr.type === 'multiselect') {
initialData[attr.name] = [];
} else {
initialData[attr.name] = '';
initialData[attr.name] = getDefaultValueForType(attr.type);
}
});
setFormData(initialData as T);
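The removed branches above define exactly what `getDefaultValueForType` must return for the cases it replaces; a sketch consistent with them (the real util in `attributeTypeMapper` may cover more types):

```typescript
// Sketch reconstructed from the removed branches above: checkbox/boolean
// fields default to false, multiselect to an empty array, everything else to
// an empty string. The real attributeTypeMapper util may handle more cases.
function getDefaultValueForType(type: string): boolean | string[] | string {
  if (type === "checkbox" || type === "boolean") return false;
  if (type === "multiselect") return [];
  return "";
}
```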
@@ -332,7 +338,7 @@
}
// Number/Float validation
if (attr.type === 'number' || attr.type === 'float') {
if (isNumberType(attr.type)) {
if (isNaN(Number(value))) {
newErrors[attr.name] = t('formgen.form.invalidNumber', `${attr.label} must be a valid number`);
return;
@@ -358,7 +364,7 @@ export function FormGeneratorForm<T extends Record<string, any>>({
}
// Select/Multiselect option validation
if (attr.type === 'select' || attr.type === 'enum') {
if (isSelectType(attr.type)) {
const options = normalizeOptions(attr);
if (options.length > 0 && !options.some(opt => String(opt.value) === String(value))) {
newErrors[attr.name] = t('formgen.form.invalidOption', 'Invalid option selected');
@@ -367,7 +373,7 @@ export function FormGeneratorForm<T extends Record<string, any>>({
}
// Timestamp/Date validation
if (attr.type === 'timestamp' || attr.type === 'date' || attr.type === 'time') {
if (isDateTimeType(attr.type)) {
const dateValue = new Date(String(value));
if (isNaN(dateValue.getTime())) {
newErrors[attr.name] = t('formgen.form.invalidDate', 'Invalid date format');
@@ -451,13 +457,13 @@ export function FormGeneratorForm<T extends Record<string, any>>({
// Readonly/Display field
if (isReadonly) {
let displayValue = value;
if (attr.type === 'checkbox' || attr.type === 'boolean') {
if (isCheckboxType(attr.type)) {
displayValue = value ? t('common.yes', 'Yes') : t('common.no', 'No');
} else if (attr.type === 'select' || attr.type === 'enum') {
} else if (isSelectType(attr.type)) {
const options = normalizeOptions(attr);
const selectedOption = options.find(opt => String(opt.value) === String(value));
displayValue = selectedOption ? selectedOption.label : value;
} else if (attr.type === 'multiselect') {
} else if (isMultiselectType(attr.type)) {
const options = normalizeOptions(attr);
const selectedValues = Array.isArray(value) ? value : (value ? [value] : []);
displayValue = selectedValues.map(v => {
@@ -479,7 +485,7 @@ export function FormGeneratorForm<T extends Record<string, any>>({
}
// Select/Enum field
if (attr.type === 'select' || attr.type === 'enum') {
if (isSelectType(attr.type)) {
const options = normalizeOptions(attr);
const isLoading = typeof attr.options === 'string' && loadingOptions[attr.name];
@@ -508,7 +514,7 @@ export function FormGeneratorForm<T extends Record<string, any>>({
}
// Multiselect field
if (attr.type === 'multiselect') {
if (isMultiselectType(attr.type)) {
const options = normalizeOptions(attr);
const currentValues = Array.isArray(value) ? value : (value ? [value] : []);
const isLoading = typeof attr.options === 'string' && loadingOptions[attr.name];
@@ -562,7 +568,7 @@ export function FormGeneratorForm<T extends Record<string, any>>({
}
// Checkbox/Boolean field
if (attr.type === 'checkbox' || attr.type === 'boolean') {
if (isCheckboxType(attr.type)) {
return (
<div className={styles.fieldGroup} key={attr.name}>
<label className={styles.checkboxLabel}>
@@ -583,7 +589,7 @@ export function FormGeneratorForm<T extends Record<string, any>>({
}
// Textarea field
if (attr.type === 'textarea') {
if (isTextareaType(attr.type)) {
const minRows = attr.minRows || 4;
const maxRows = attr.maxRows || 8;
const minHeight = minRows * 1.5 * 16;
@@ -635,7 +641,7 @@ export function FormGeneratorForm<T extends Record<string, any>>({
}
// File field
if (attr.type === 'file') {
if (isFileType(attr.type)) {
return (
<div className={styles.floatingLabelInput} key={attr.name}>
<input
@@ -658,14 +664,7 @@ export function FormGeneratorForm<T extends Record<string, any>>({
}
// Default input field (text, email, date, time, url, password, number, integer, float)
const inputType = attr.type === 'email' ? 'email' :
attr.type === 'date' ? 'date' :
attr.type === 'time' ? 'time' :
attr.type === 'timestamp' ? 'datetime-local' :
attr.type === 'url' ? 'url' :
attr.type === 'password' ? 'password' :
(attr.type === 'number' || attr.type === 'integer' || attr.type === 'float') ? 'number' :
'text';
const inputType = attributeTypeToInputType(attr.type);
return (
<div className={styles.floatingLabelInput} key={attr.name}>
@@ -674,7 +673,7 @@ export function FormGeneratorForm<T extends Record<string, any>>({
value={value || ''}
onChange={(e) => {
let newValue: any = e.target.value;
if (attr.type === 'number' || attr.type === 'integer' || attr.type === 'float') {
if (isNumberType(attr.type)) {
newValue = e.target.value === '' ? '' : Number(e.target.value);
}
handleFieldChange(attr.name, newValue);

View file

@@ -13,12 +13,18 @@ import {
import { formatUnixTimestamp } from '../../../utils/time';
import TextField from '../../UiComponents/TextField/TextField';
import { FormGeneratorControls } from '../FormGeneratorControls';
import {
isSelectType,
isCheckboxType,
attributeTypeToInputType
} from '../../../utils/attributeTypeMapper';
import type { AttributeType } from '../../../utils/attributeTypeMapper';
// Types for the FormGeneratorList
export interface FieldConfig {
key: string;
label: string;
type?: 'string' | 'number' | 'date' | 'boolean' | 'enum' | 'readonly';
type?: AttributeType;
editable?: boolean;
required?: boolean;
formatter?: (value: any, row: any) => React.ReactNode;
@@ -447,7 +453,7 @@ export function FormGeneratorList<T extends Record<string, any>>({
};
// Render field input
const renderFieldInput = (field: FieldConfig, value: any, row: T, index: number) => {
const renderFieldInput = (field: FieldConfig, value: any, row: T, _index: number) => {
if (field.type === 'readonly' || !field.editable) {
return (
<div className={styles.fieldValue} key={field.key}>
@@ -456,7 +462,7 @@ export function FormGeneratorList<T extends Record<string, any>>({
);
}
if (field.type === 'enum' && field.options) {
if (isSelectType(field.type || 'string') && field.options) {
return (
<select
key={field.key}
@@ -472,7 +478,7 @@ export function FormGeneratorList<T extends Record<string, any>>({
);
}
if (field.type === 'boolean') {
if (isCheckboxType(field.type || 'string')) {
return (
<input
key={field.key}
@@ -485,12 +491,15 @@ export function FormGeneratorList<T extends Record<string, any>>({
}
// Default to text input
const inputType = attributeTypeToInputType(field.type || 'string');
// TextField doesn't support 'textarea' type, use 'text' instead
const textFieldType = inputType === 'textarea' ? 'text' : inputType;
return (
<TextField
key={field.key}
value={value || ''}
onChange={(newValue) => onFieldChange?.(row, field.key, newValue)}
type={field.type === 'date' ? 'date' : field.type === 'number' ? 'number' : 'text'}
type={textFieldType as 'text' | 'email' | 'url' | 'password' | 'search' | 'tel' | 'number'}
required={field.required}
readonly={!field.editable}
className={styles.fieldInput}

View file

@@ -13,12 +13,16 @@ import {
import { formatUnixTimestamp } from '../../../utils/time';
import { FormGeneratorControls } from '../FormGeneratorControls';
import { CopyableTruncatedValue } from '../../UiComponents/CopyableTruncatedValue';
import {
isDateTimeType
} from '../../../utils/attributeTypeMapper';
import type { AttributeType } from '../../../utils/attributeTypeMapper';
// Types for the FormGeneratorTable
export interface ColumnConfig {
key: string;
label: string;
type?: 'string' | 'number' | 'date' | 'boolean' | 'enum';
type?: AttributeType;
width?: number;
minWidth?: number;
maxWidth?: number;
@@ -526,7 +530,8 @@ export function FormGeneratorTable<T extends Record<string, any>>({
const isLikelyTimestamp = typeof value === 'number' && value > 0 && value < 4102444800000;
// If it's a timestamp field or looks like a timestamp, format as date
if ((isTimestampField || isLikelyTimestamp) && typeof value === 'number') {
// Also check if column type is a date/time type
if ((isTimestampField || isLikelyTimestamp || (column.type && isDateTimeType(column.type))) && typeof value === 'number') {
try {
// Handle Unix timestamps in seconds (backend format)
let timestamp: number;
@@ -557,6 +562,8 @@ export function FormGeneratorTable<T extends Record<string, any>>({
switch (column.type) {
case 'date':
case 'timestamp':
case 'time':
try {
// Handle Unix timestamps in seconds (backend format)
let timestamp: number;
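The comments in this hunk note that the backend sends Unix timestamps in seconds while JS `Date` expects milliseconds; the normalization the truncated code performs presumably resembles this sketch (the threshold choice is an assumption, not taken from the diff):

```typescript
// Hypothetical seconds-vs-milliseconds normalization hinted at by the
// comments above. 4102444800 is 2100-01-01 in seconds: smaller values are
// treated as seconds and scaled up; larger values are assumed to be ms.
function toMillis(value: number): number {
  return value < 4102444800 ? value * 1000 : value;
}
```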

View file

@@ -1,18 +1,12 @@
// Legacy export - FormGenerator is now FormGeneratorTable (for backward compatibility)
export { FormGeneratorTable as FormGenerator } from './FormGeneratorTable';
export type { ColumnConfig, FormGeneratorTableProps as FormGeneratorProps } from './FormGeneratorTable';
// Re-export FormGenerator components
export * from './FormGeneratorTable';
export * from './FormGeneratorList';
export * from './FormGeneratorForm';
export * from './FormGeneratorControls';
export { FormGeneratorTable } from './FormGeneratorTable';
export type { ColumnConfig, FormGeneratorTableProps } from './FormGeneratorTable';
export { FormGeneratorList } from './FormGeneratorList';
export type { FieldConfig, FormGeneratorListProps } from './FormGeneratorList';
export { FormGeneratorControls } from './FormGeneratorControls';
export type { FilterableField, FormGeneratorControlsProps } from './FormGeneratorControls';
export { FormGeneratorForm } from './FormGeneratorForm';
export type { FormGeneratorFormProps, AttributeDefinition, AttributeOption } from './FormGeneratorForm';
// Alias FormGeneratorTable as FormGenerator for backward compatibility
export { FormGeneratorTable as FormGenerator, FormGeneratorTableComponent as FormGeneratorComponent } from './FormGeneratorTable';
export type { FormGeneratorTableProps as FormGeneratorProps, ColumnConfig } from './FormGeneratorTable';
// Re-export action button components and types
export * from './ActionButtons';

View file

@@ -32,7 +32,6 @@ const SidebarItem: React.FC<SidebarItemProps> = React.memo(({
// Get the actual color from parent li element
const parentLi = wrapper.closest('li');
const parentColor = parentLi ? window.getComputedStyle(parentLi).color : '#000000';
// Force color directly - use black for now to ensure visibility
const iconColor = '#000000'; // Force black for visibility
@@ -218,7 +217,6 @@ const SidebarItem: React.FC<SidebarItemProps> = React.memo(({
>
<Icon
className={`${styles.icon} ${styles.iconMinimized} ${isDisabled ? styles.disabledIcon : ''}`}
size={25}
style={{
width: '25px',
height: '25px',

View file

@@ -53,7 +53,6 @@ const SidebarSubmenu: React.FC<SidebarSubmenuProps> = ({ item, isOpen, isMinimiz
{SubIcon && (
<SubIcon
className={styles.submenuHorizontalIcon}
size={16}
style={{
width: '16px',
height: '16px',

View file

@@ -64,7 +64,7 @@ export function ConnectedFilesList({
deletingFiles = new Set(),
previewingFiles = new Set(),
removingFiles = new Set(),
workflowId,
workflowId: _workflowId,
emptyMessage = 'No files connected to this workflow'
}: ConnectedFilesListProps) {
// Combine workflow files and pending files, deduplicating by fileId
@@ -98,7 +98,7 @@
}
return false;
},
removeOptimistically: (fileId: string) => {
removeOptimistically: (_fileId: string) => {
// This will be handled by the parent component's state
},
refetch: async () => {
@@ -121,7 +121,7 @@
// View button (always shown)
buttons.push({
type: 'view',
onAction: async (file: WorkflowFile) => {
onAction: async (_file: WorkflowFile) => {
// View is handled by ViewActionButton's FilePreview component
return Promise.resolve();
},
@@ -156,7 +156,7 @@
return buttons;
}, [actionButtons, onDelete, onRemove]);
const handleView = async (file: WorkflowFile) => {
const handleView = async (_file: WorkflowFile) => {
// View is handled by ViewActionButton's FilePreview component
return Promise.resolve();
};
@@ -187,10 +187,10 @@
<div className={styles.fileList}>
{allFiles
.filter(file => file.fileId && file.fileId.trim() !== '') // Ensure fileId exists
.map((file, index) => {
const isDeleting = deletingFiles.has(file.fileId!);
const isPreviewing = previewingFiles.has(file.fileId!);
const isRemoving = removingFiles.has(file.fileId!);
.map((file) => {
// const isDeleting = deletingFiles.has(file.fileId!);
// const isPreviewing = previewingFiles.has(file.fileId!);
// const isRemoving = removingFiles.has(file.fileId!);
// Use fileId as key since we've filtered out files without it
const uniqueKey = file.fileId!;

View file

@@ -12,7 +12,10 @@
padding: 16px 20px;
display: flex;
flex-direction: column;
gap: 16px;
min-width: 0;
overflow-x: hidden;
box-sizing: border-box;
max-width: 100%;
}
.emptyState {
@@ -26,47 +29,130 @@
justify-content: center;
}
/* Round Group */
.roundGroup {
/* Dashboard Tree Styles */
.dashboardSection {
display: flex;
flex-direction: column;
gap: 12px;
margin-bottom: 16px;
min-width: 0;
max-width: 100%;
box-sizing: border-box;
overflow-x: hidden;
}
.roundHeader {
.dashboardContainer {
display: flex;
flex-direction: column;
gap: 8px;
padding: 12px 16px;
min-width: 0;
overflow-x: hidden;
box-sizing: border-box;
max-width: 100%;
}
.dashboardContainer > .operationNode {
min-width: 0;
max-width: 100%;
}
.operationNode {
display: flex;
flex-direction: column;
gap: 0;
position: relative;
min-width: 0;
box-sizing: border-box;
}
.operationNodeIndented {
border-left: 2px solid var(--color-border, #e0e0e0);
position: relative;
}
.operationNodeIndented::before {
content: '';
position: absolute;
left: -1px;
top: 0;
bottom: 0;
width: 2px;
background-color: var(--color-border, #e0e0e0);
opacity: 0.6;
}
.operationRow {
display: flex !important;
flex-direction: row !important;
align-items: flex-start;
min-height: 32px;
min-width: 0;
box-sizing: border-box;
}
.operationContent {
flex: 1;
display: flex !important;
flex-direction: column !important;
min-width: 0;
box-sizing: border-box;
}
.operationHeader {
display: flex !important;
flex-direction: column !important;
padding: 6px 12px;
background-color: var(--color-surface);
border: 1px solid var(--color-border);
border-radius: var(--object-radius-small);
transition: background-color 0.2s ease, border-color 0.2s ease;
}
.roundHeader.clickable {
cursor: pointer;
user-select: none;
}
.roundHeader.clickable:hover {
background-color: var(--color-highlight-gray);
border-color: var(--color-primary);
}
.roundHeaderLabel {
display: flex;
justify-content: space-between;
align-items: center;
font-size: 13px;
font-weight: 600;
transition: background-color 0.2s ease, border-color 0.2s ease;
min-height: 32px;
flex: 1;
min-width: 0;
overflow: hidden;
margin-top: 4px;
box-sizing: border-box;
}
.operationHeaderRow {
display: flex !important;
flex-direction: row !important;
align-items: center !important;
gap: 4px;
min-width: 0;
box-sizing: border-box;
}
.expandButton {
background: none;
border: none;
cursor: pointer;
padding: 4px;
display: flex;
align-items: center;
justify-content: center;
color: var(--color-gray);
transition: color 0.2s ease;
width: 20px;
height: 20px;
flex-shrink: 0;
}
.expandButton:hover {
color: var(--color-text);
}
.collapseIcon {
.expandButtonSpacer {
width: 20px;
display: inline-block;
flex-shrink: 0;
}
.collapseIcon {
font-size: 10px;
color: var(--color-gray);
display: inline-block;
transition: transform 0.2s ease;
}
@@ -74,29 +160,241 @@
transform: rotate(-90deg);
}
.roundLogs {
display: flex;
flex-direction: column;
gap: 8px;
padding-left: 16px;
.operationIcon {
font-size: 12px;
font-weight: bold;
display: inline-flex;
align-items: center;
justify-content: center;
width: 20px;
height: 20px;
border-radius: 50%;
background-color: var(--color-primary, #007bff);
color: white;
flex-shrink: 0;
}
/* Dark theme support */
[data-theme="dark"] .roundHeader {
.operationIcon[data-type="success"] {
background-color: var(--color-success, #28a745);
color: white;
}
.operationIcon[data-type="error"] {
background-color: var(--color-error, #dc3545);
color: white;
}
.operationIcon[data-type="warning"] {
background-color: var(--color-warning, #ffc107);
color: var(--color-text);
}
.operationIcon[data-type="info"] {
background-color: var(--color-primary, #007bff);
color: white;
}
.operationName {
flex: 1;
color: var(--color-text);
font-weight: 500;
min-width: 0;
overflow: hidden;
text-overflow: ellipsis;
white-space: nowrap;
max-width: 100%;
}
.statusMessageTag {
font-size: 11px;
color: var(--color-gray);
font-weight: 400;
padding: 2px 6px;
background-color: var(--color-highlight-gray);
border-radius: 4px;
white-space: nowrap;
flex-shrink: 0;
margin-left: 8px;
}
.operationTimestamp {
font-size: 11px;
color: var(--color-gray);
font-weight: 400;
font-family: monospace;
white-space: nowrap;
flex-shrink: 0;
margin-right: 8px;
}
.statusBadge {
padding: 2px 8px;
border-radius: 12px;
font-size: 11px;
font-weight: 600;
text-transform: uppercase;
background-color: var(--color-highlight-gray);
color: var(--color-text);
white-space: nowrap;
flex-shrink: 0;
}
.statusBadge.statusCompleted {
background-color: var(--color-success, #28a745);
color: white;
}
.statusBadge.statusFailed {
background-color: var(--color-error, #dc3545);
color: white;
}
.statusBadge.statusRunning {
background-color: var(--color-primary, #007bff);
color: white;
}
.progressPercentage {
font-size: 11px;
color: var(--color-gray);
font-weight: 600;
min-width: 45px;
text-align: right;
flex-shrink: 0;
}
.progressBarContainer {
height: 4px;
background-color: var(--color-highlight-gray);
border-radius: 2px;
overflow: hidden;
width: 100%;
}
.progressBar {
height: 100%;
background-color: var(--color-primary, #007bff);
transition: width 0.3s ease;
}
.progressBar.progressCompleted {
background-color: var(--color-success, #28a745);
}
.operationChildren {
display: flex;
flex-direction: column;
gap: 0;
position: relative;
}
/* Log messages container */
.operationLogsContainer {
display: flex;
flex-direction: column;
min-width: 0;
box-sizing: border-box;
}
.operationLogsList {
display: flex;
flex-direction: column;
padding-left: 0;
border-left: 1px solid var(--color-border);
margin-left: 30px; /* Align with header content: operationNode paddingLeft (12px) + header padding (12px) */
min-width: 0;
box-sizing: border-box;
}
.logEntry {
display: flex;
flex-direction: column;
padding: 6px 8px;
background-color: var(--color-surface);
border-radius: var(--object-radius-small);
border: 1px solid var(--color-border);
min-width: 0;
box-sizing: border-box;
margin-top: 4px;
}
.logEntryHeader {
display: flex;
align-items: center;
font-size: 11px;
flex-wrap: wrap;
min-width: 0;
box-sizing: border-box;
}
.logTimestamp {
color: var(--color-gray);
font-weight: 500;
font-family: monospace;
flex-shrink: 0;
}
.logEntryMessage {
font-size: 13px;
color: var(--color-text);
line-height: 1.4;
word-wrap: break-word;
word-break: break-word;
overflow-wrap: break-word;
flex: 1;
min-width: 0;
box-sizing: border-box;
}
.logProgress {
font-size: 10px;
color: var(--color-gray);
font-weight: 600;
flex-shrink: 0;
}
/* Dark theme support for log entries */
[data-theme="dark"] .logEntry {
background-color: var(--color-surface-dark);
border-color: var(--color-border-dark);
}
[data-theme="dark"] .roundHeader.clickable:hover {
background-color: rgba(255, 255, 255, 0.05);
border-color: var(--color-primary);
}
[data-theme="dark"] .roundHeaderLabel {
color: var(--color-text-dark);
}
[data-theme="dark"] .collapseIcon {
[data-theme="dark"] .logTimestamp {
color: var(--color-gray-dark);
}
[data-theme="dark"] .logEntryMessage {
color: var(--color-text-dark);
}
[data-theme="dark"] .operationLogsList {
border-left-color: var(--color-border-dark);
}
/* Dark theme support for dashboard */
[data-theme="dark"] .operationHeader {
background-color: var(--color-surface-dark);
border-color: var(--color-border-dark);
}
[data-theme="dark"] .operationName {
color: var(--color-text-dark);
}
[data-theme="dark"] .statusBadge {
background-color: rgba(255, 255, 255, 0.1);
color: var(--color-text-dark);
}
[data-theme="dark"] .progressPercentage {
color: var(--color-gray-dark);
}
[data-theme="dark"] .progressBarContainer {
background-color: rgba(255, 255, 255, 0.1);
}
[data-theme="dark"] .operationNode[data-depth] {
border-left-color: var(--color-border-dark);
}


@@ -1,115 +1,263 @@
import React, { useMemo, useState, useEffect } from 'react';
import { LogProps, RoundGroup } from './LogTypes';
import { formatUnixTimestamp } from '../../../utils/time';
import React from 'react';
import { LogProps } from './LogTypes';
import { AutoScroll } from '../AutoScroll';
import { LogMessage } from './LogMessage/LogMessage';
import { formatUnixTimestamp } from '../../../utils/time';
import styles from './Log.module.css';
// Helper function to group logs by round
const groupLogsByRound = (logs: any[]): RoundGroup[] => {
const roundMap = new Map<number, RoundGroup>();
let currentRound = 1; // Track current round
// Sort logs chronologically first
const sortedLogs = [...logs].sort((a, b) => (a.timestamp || 0) - (b.timestamp || 0));
sortedLogs.forEach((log) => {
const message = (log.message || '').toLowerCase();
// Check if this is a workflow status message that indicates a round change
if (message.includes('workflow started') || message.includes('workflow resumed')) {
const roundMatch = message.match(/\(?round\s+(\d+)\)?/i);
if (roundMatch) {
currentRound = parseInt(roundMatch[1], 10);
} else if (message.includes('workflow started')) {
// If started without round number, assume round 1
currentRound = 1;
}
// If resumed without round number, keep current round
}
// Assign log to current round
const roundNumber = currentRound;
if (!roundMap.has(roundNumber)) {
roundMap.set(roundNumber, {
round: roundNumber,
logs: [],
latestProgress: undefined,
latestTimestamp: 0
});
}
const roundGroup = roundMap.get(roundNumber)!;
roundGroup.logs.push(log);
// Update latest progress and timestamp
if (log.progress !== undefined && log.progress !== null) {
if (roundGroup.latestProgress === undefined || log.progress > roundGroup.latestProgress) {
roundGroup.latestProgress = log.progress;
}
}
if ((log.timestamp || 0) > roundGroup.latestTimestamp) {
roundGroup.latestTimestamp = log.timestamp || 0;
}
});
// Sort rounds and logs within each round
return Array.from(roundMap.values())
.sort((a, b) => a.round - b.round)
.map(roundGroup => ({
...roundGroup,
logs: roundGroup.logs.sort((a, b) => (a.timestamp || 0) - (b.timestamp || 0))
}));
// Helper to get status badge class
const getStatusBadgeClass = (status?: string | null): string => {
if (!status) return styles.statusBadge;
switch (status.toLowerCase()) {
case 'completed':
return `${styles.statusBadge} ${styles.statusCompleted}`;
case 'failed':
case 'error':
return `${styles.statusBadge} ${styles.statusFailed}`;
case 'running':
return `${styles.statusBadge} ${styles.statusRunning}`;
default:
return styles.statusBadge;
}
};
const Log: React.FC<LogProps> = ({
className = '',
emptyMessage = 'No log information available',
logs = []
dashboardTree,
onToggleOperationExpanded,
getChildOperations
}) => {
// Group logs by round
const roundGroups = useMemo(() => groupLogsByRound(logs), [logs]);
// Get the latest round number
const latestRound = roundGroups.length > 0 ? roundGroups[roundGroups.length - 1].round : null;
// State to track collapsed rounds (round number -> isCollapsed)
const [collapsedRounds, setCollapsedRounds] = useState<Set<number>>(new Set());
// Initialize collapsed state: collapse all rounds except the latest one
useEffect(() => {
if (roundGroups.length > 0 && latestRound !== null) {
setCollapsedRounds(prev => {
const newSet = new Set(prev);
// Ensure latest round is not collapsed
newSet.delete(latestRound);
// Collapse all other rounds that aren't already in the set
roundGroups.forEach(rg => {
if (rg.round !== latestRound && !newSet.has(rg.round)) {
newSet.add(rg.round);
}
});
return newSet;
const formatLogTimestamp = (timestamp: number): string => {
try {
const formatted = formatUnixTimestamp(timestamp, undefined, {
year: 'numeric',
month: '2-digit',
day: '2-digit',
hour: '2-digit',
minute: '2-digit',
second: '2-digit',
hour12: false
});
return formatted.time;
} catch {
return new Date(timestamp * 1000).toLocaleString();
}
}, [roundGroups.length, latestRound]); // Only update when rounds change, not on every log update
// Toggle collapse state for a round
const toggleRoundCollapse = (round: number) => {
setCollapsedRounds(prev => {
const newSet = new Set(prev);
if (newSet.has(round)) {
newSet.delete(round);
} else {
newSet.add(round);
}
return newSet;
};
// Render operation node recursively
const renderOperationNode = (operationId: string, depth: number = 0): React.ReactNode => {
if (!dashboardTree || !getChildOperations) {
return null;
}
const operation = dashboardTree.operations.get(operationId);
if (!operation) {
return null;
}
// Get logs for this operation, sorted by timestamp
const logsArray = Array.from(operation.logs.values()).sort((a, b) => {
const tsA = a.timestamp || 0;
const tsB = b.timestamp || 0;
return tsA - tsB; // Ascending order (oldest first)
});
// Get latest log for timestamp
const latestLog = logsArray.length > 0 ? logsArray[logsArray.length - 1] : null;
// Skip rendering if no logs yet
if (logsArray.length === 0) {
return null;
}
// Get child operations
const childOperations = getChildOperations(operationId);
const hasChildren = childOperations.length > 0;
const hasLogs = logsArray.length > 0;
const hasContentToExpand = hasChildren || hasLogs;
// Calculate progress percentage
let progressPercentage = 0;
if (operation.latestProgress !== null && operation.latestProgress !== undefined) {
progressPercentage = Math.min(Math.max(operation.latestProgress * 100, 0), 100);
}
// Force 100% progress when status is 'completed'
if (operation.latestStatus === 'completed') {
progressPercentage = 100;
}
// Use stable operation name (from first log) or fallback to operationId
const operationName = operation.operationName || `Operation ${operationId}`;
// Use latest message as status tag (updates with each poll)
const latestMessage = operation.latestMessage || '';
const operationStatus = operation.latestStatus || 'running';
const operationTimestamp = latestLog?.timestamp;
// Calculate consistent indentation per level (24px per level)
const hasIndent = depth > 0;
// Calculate log entry indentation to align with operation name
// Operation name starts at: header padding-left (12px) + button/spacer (20px) + gap (8px) = 40px from operationContent
// operationLogsList has margin-left: 12px (for border), so log entries are at: container marginLeft + 12px
// We want log entries at 40px from operationContent, so: container marginLeft + 12px = 40px
// Therefore: container marginLeft = 28px from operationContent
// But operationNode has paddingLeft: 12px for indented nodes, 0 for root
// So from operationNode: container marginLeft = 28px - operationNode.paddingLeft
// Root: 28px - 0 = 28px
// Indented: 28px - 12px = 16px
const logIndentPx = hasIndent ? 16 : 28;
// Calculate header indentation to match message indentation
// Headers are inside operationNode which has paddingLeft: 12px (for indented)
// Messages container has marginLeft: logIndentPx from operationNode, and list has margin-left: 12px
// So messages start at: logIndentPx + 12px from operationNode's left edge
// Headers start at: operationNode marginLeft + paddingLeft = headerIndentPx + 12px
// To align: headerIndentPx + 12px = logIndentPx + 12px, so headerIndentPx = logIndentPx
// But headers have their own padding (12px), so header content starts at headerIndentPx + 12px + 12px
// Messages start at logIndentPx + 12px, so we need headerIndentPx = logIndentPx - 12px to align content
// Actually, we want the header box to align with messages, so headerIndentPx should account for header padding
const headerIndentPx = logIndentPx; // Headers and messages both use logIndentPx, paddingLeft handles alignment
return (
<div
key={operationId}
className={`${styles.operationNode} ${hasIndent ? styles.operationNodeIndented : ''}`}
style={{
marginLeft: `${headerIndentPx}px`,
paddingLeft: hasIndent ? '12px' : '0',
position: 'relative'
}}
>
<div className={styles.operationRow}>
{/* Operation content */}
<div className={styles.operationContent}>
<div className={styles.operationHeader}>
<div className={styles.operationHeaderRow}>
{hasContentToExpand && (
<button
className={styles.expandButton}
onClick={() => onToggleOperationExpanded?.(operationId)}
aria-label={operation.expanded ? 'Collapse' : 'Expand'}
>
<span className={`${styles.collapseIcon} ${operation.expanded ? '' : styles.collapsed}`}>
</span>
</button>
)}
{!hasContentToExpand && <span className={styles.expandButtonSpacer} />}
<span className={styles.operationName}>{operationName}</span>
{/* Latest status message tag (updates with each poll) */}
{latestMessage && (
<span className={styles.statusMessageTag}>
{latestMessage}
</span>
)}
{operationTimestamp && (
<span className={styles.operationTimestamp}>
{formatLogTimestamp(operationTimestamp)}
</span>
)}
<span className={getStatusBadgeClass(operationStatus)}>
{operationStatus}
</span>
{progressPercentage > 0 && (
<span className={styles.progressPercentage}>
{Math.round(progressPercentage)}%
</span>
)}
</div>
{progressPercentage > 0 && (
<div className={styles.progressBarContainer}>
<div
className={`${styles.progressBar} ${progressPercentage >= 100 ? styles.progressCompleted : ''}`}
style={{ width: `${progressPercentage}%` }}
/>
</div>
)}
</div>
</div>
</div>
{/* Show logs and children when expanded */}
{operation.expanded && (
<>
{/* Log messages for this operation - show only latest log */}
{latestLog && (
<div
className={styles.operationLogsContainer}
style={{
// Messages should align with header content (not header box edge)
// Header content starts at: operationNode paddingLeft (12px) + header padding (12px) = 24px from operationNode left
// Messages should start at the same position: 0 marginLeft (since operationNode already has paddingLeft)
marginLeft: '0px'
}}
>
<div className={styles.operationLogsList}>
<div key={`log-${operationId}-latest`} className={styles.logEntry}>
<div className={styles.logEntryHeader}>
<span className={styles.logTimestamp}>
{formatLogTimestamp(latestLog.timestamp)}
</span>
<span className={styles.logEntryMessage}>
{latestLog.message}
</span>
{latestLog.status && (
<span className={getStatusBadgeClass(latestLog.status)}>
{latestLog.status}
</span>
)}
{latestLog.progress !== undefined && latestLog.progress !== null && (
<span className={styles.logProgress}>
{Math.round(latestLog.progress * 100)}%
</span>
)}
</div>
</div>
</div>
</div>
)}
{/* Child operations */}
{hasChildren && (
<div className={styles.operationChildren}>
{childOperations.map((childOpId) => renderOperationNode(childOpId, depth + 1))}
</div>
)}
</>
)}
</div>
);
};
if (logs.length === 0) {
// Render dashboard tree
const renderDashboard = (): React.ReactNode => {
if (!dashboardTree || !getChildOperations) {
return null;
}
if (dashboardTree.rootOperations.length === 0) {
return (
<div className={styles.emptyState}>{emptyMessage}</div>
);
}
return (
<div className={styles.dashboardContainer}>
{dashboardTree.rootOperations.map((rootOpId) => renderOperationNode(rootOpId, 0))}
</div>
);
};
// Check if we have dashboard logs to display
const hasDashboardLogs = dashboardTree && dashboardTree.rootOperations.length > 0;
if (!hasDashboardLogs) {
return (
<div className={`${styles.logContainer} ${className}`}>
<div className={styles.emptyState}>{emptyMessage}</div>
@@ -119,65 +267,11 @@ const Log: React.FC<LogProps> = ({
return (
<div className={`${styles.logContainer} ${className}`}>
{/* Scrollable Content Section - All Rounds in Chronological Order */}
<AutoScroll
scrollDependency={logs.length}
>
<AutoScroll scrollDependency={dashboardTree.rootOperations.length}>
<div className={styles.scrollableContent}>
{/* All Round Groups - In Chronological Order (Oldest First, Latest Last) */}
{roundGroups.map((roundGroup) => {
const isCollapsed = collapsedRounds.has(roundGroup.round);
return (
<div key={`round-${roundGroup.round}`} className={styles.roundGroup}>
{/* Round Header - Clickable */}
{roundGroup.logs.length > 0 && (
<div
className={`${styles.roundHeader} ${styles.clickable}`}
onClick={() => toggleRoundCollapse(roundGroup.round)}
>
<div className={styles.roundHeaderLabel}>
<span>Round {roundGroup.round} Logs</span>
<span className={`${styles.collapseIcon} ${isCollapsed ? styles.collapsed : ''}`}>
</span>
</div>
</div>
)}
{/* Log Messages for this Round - Collapsible */}
{!isCollapsed && (
<div className={styles.roundLogs}>
{roundGroup.logs.map((log, index) => {
// Convert log to Message format for LogMessage component
const message = {
id: log.id || `log-${index}`,
workflowId: log.workflowId || '',
message: log.message || '',
status: log.status,
timestamp: log.timestamp,
publishedAt: log.timestamp,
sequenceNr: index,
role: 'system',
documents: undefined,
summary: undefined
};
return (
<LogMessage
key={message.id}
message={message}
showDocuments={false}
showMetadata={true}
showProgress={false}
/>
);
})}
</div>
)}
<div className={styles.dashboardSection}>
{renderDashboard()}
</div>
);
})}
</div>
</AutoScroll>
</div>
@@ -185,4 +279,3 @@ const Log: React.FC<LogProps> = ({
};
export default Log;


@@ -1,5 +1,3 @@
import type React from 'react';
/**
* Log entry from workflow
*/
@@ -17,13 +15,21 @@ export interface WorkflowLog {
}
/**
* Round group containing logs and progress
* Dashboard log tree structure
*/
export interface RoundGroup {
round: number;
logs: WorkflowLog[];
latestProgress: number | undefined;
latestTimestamp: number;
export interface DashboardLogTree {
operations: Map<string, {
logs: Map<string, WorkflowLog>;
parentId: string | null;
expanded: boolean;
latestProgress: number | null;
latestStatus: string | null;
operationName: string | null;
latestMessage: string | null;
}>;
rootOperations: string[];
logExpandedStates: Map<string, boolean>;
currentRound: number | null;
}
/**
@@ -42,8 +48,18 @@ export interface LogProps {
emptyMessage?: string;
/**
* Array of log entries to display
* Dashboard log tree (logs with operationId)
*/
logs?: WorkflowLog[];
dashboardTree?: DashboardLogTree;
/**
* Callback to toggle operation expanded state
*/
onToggleOperationExpanded?: (operationId: string) => void;
/**
* Function to get child operations for a parent
*/
getChildOperations?: (parentId: string | null) => string[];
}


@@ -2,7 +2,7 @@ import React, { useEffect, useRef } from 'react';
import L from 'leaflet';
import 'leaflet/dist/leaflet.css';
import { lv95ToWGS84, wgs84ToLV95 } from './LV95Converter';
import type { MapPoint, ParcelGeometry, MapViewProps } from './MapView';
import type { MapViewProps } from './MapView';
import styles from './MapView.module.css';
// Fix for default marker icons in Leaflet
@@ -32,7 +32,7 @@ const MapViewLeaflet: React.FC<MapViewProps> = ({
}) => {
const mapRef = useRef<L.Map | null>(null);
const mapContainerRef = useRef<HTMLDivElement>(null);
const layersRef = useRef<L.LayerGroup[]>([]);
const layersRef = useRef<L.Layer[]>([]);
const centerMarkerRef = useRef<L.Marker | null>(null);
// Initialize map


@@ -23,7 +23,7 @@ export interface DocumentItemProps {
*/
export const DocumentItem: React.FC<DocumentItemProps> = ({
document,
message,
message: _message,
className,
onFileDelete,
onFileRemove,
@@ -31,7 +31,7 @@ export const DocumentItem: React.FC<DocumentItemProps> = ({
deletingFiles = new Set(),
previewingFiles = new Set(),
removingFiles = new Set(),
workflowId
workflowId: _workflowId
}) => {
// Convert MessageDocument to WorkflowFile format for compatibility with action buttons
const workflowFile: WorkflowFile = useMemo(() => ({
@@ -50,7 +50,7 @@ export const DocumentItem: React.FC<DocumentItemProps> = ({
// Create hookData object for action buttons
const hookData = useMemo(() => ({
handleDelete: async (fileId: string) => {
handleDelete: async (_fileId: string) => {
if (onFileDelete) {
await onFileDelete(workflowFile);
return true;


@@ -1,56 +0,0 @@
/* ViewForm container */
.viewForm {
width: 100%;
}
/* Field styling */
.fieldGroup {
margin-bottom: 16px;
padding-bottom: 12px;
border-bottom: 1px solid #f3f4f6;
}
.fieldGroup:last-child {
border-bottom: none;
margin-bottom: 0;
}
.fieldLabel {
display: block;
font-weight: 600;
color: #374151;
margin-bottom: 6px;
font-size: 14px;
text-transform: capitalize;
}
.fieldValue {
color: #6b7280;
font-size: 14px;
line-height: 1.5;
word-break: break-word;
padding: 4px 0;
}
/* Special styling for different value types */
.fieldValue:empty::before {
content: 'N/A';
color: #9ca3af;
font-style: italic;
}
/* Responsive design */
@media (max-width: 640px) {
.fieldGroup {
margin-bottom: 12px;
padding-bottom: 8px;
}
.fieldLabel {
font-size: 13px;
}
.fieldValue {
font-size: 13px;
}
}


@@ -1,46 +0,0 @@
import styles from './ViewForm.module.css';
// Field configuration interface for ViewForm
export interface ViewFieldConfig {
key: string;
label: string;
formatter?: (value: any) => string;
}
// ViewForm props - for display-only purposes
export interface ViewFormProps<T = any> {
data: T;
fields: ViewFieldConfig[];
className?: string;
}
// ViewForm component - displays data in read-only format
export function ViewForm<T extends Record<string, any>>({
data,
fields,
className = ''
}: ViewFormProps<T>) {
// Render field in view-only mode
const renderField = (field: ViewFieldConfig) => {
const value = data[field.key];
return (
<div className={styles.fieldGroup} key={field.key}>
<label className={styles.fieldLabel}>{field.label}</label>
<div className={styles.fieldValue}>
{field.formatter ? field.formatter(value) : (value || 'N/A')}
</div>
</div>
);
};
return (
<div className={`${styles.viewForm} ${className}`}>
{fields.map(field => renderField(field))}
</div>
);
}
export default ViewForm;


@@ -4,8 +4,4 @@ export type { PopupProps, PopupAction } from './Popup';
// FormGeneratorForm component (recommended for backend-driven forms)
export { FormGeneratorForm } from '../../FormGenerator/FormGeneratorForm';
export type { FormGeneratorFormProps, AttributeDefinition, AttributeOption } from '../../FormGenerator/FormGeneratorForm';
// ViewForm component
export { ViewForm } from './ViewForm';
export type { ViewFormProps } from './ViewForm';
export type { FormGeneratorFormProps, AttributeDefinition, AttributeOption } from '../../FormGenerator/FormGeneratorForm';


@@ -9,6 +9,7 @@ interface TextFieldProps extends BaseTextFieldProps {
step?: string;
min?: string | number;
max?: string | number;
onKeyDown?: (e: React.KeyboardEvent<HTMLInputElement | HTMLTextAreaElement>) => void;
}
const TextField: React.FC<TextFieldProps> = ({


@@ -0,0 +1,43 @@
.viewForm {
display: flex;
flex-direction: column;
gap: 1.5rem;
padding: 1rem 0;
}
.fieldGroup {
display: flex;
flex-direction: column;
gap: 0.5rem;
}
.fieldLabel {
font-weight: 600;
font-size: 0.875rem;
color: var(--text-secondary, #666);
text-transform: uppercase;
letter-spacing: 0.05em;
}
.fieldValue {
font-size: 1rem;
color: var(--text-primary, #333);
padding: 0.75rem;
background-color: var(--background-secondary, #f5f5f5);
border-radius: 4px;
min-height: 2.5rem;
display: flex;
align-items: center;
word-break: break-word;
}
/* Dark theme support */
[data-theme="dark"] .fieldLabel {
color: var(--text-secondary, #aaa);
}
[data-theme="dark"] .fieldValue {
color: var(--text-primary, #e0e0e0);
background-color: var(--background-secondary, #2a2a2a);
}


@@ -0,0 +1,114 @@
import styles from './ViewForm.module.css';
import {
isCheckboxType,
isSelectType,
isMultiselectType,
isDateTimeType
} from '../../../utils/attributeTypeMapper';
import type { AttributeType } from '../../../utils/attributeTypeMapper';
// Field configuration interface for ViewForm
export interface ViewFieldConfig {
key: string;
label: string;
type?: AttributeType;
formatter?: (value: any) => string;
options?: Array<{ value: string | number; label: string }>; // For select/enum types
}
// ViewForm props - for display-only purposes
export interface ViewFormProps<T = any> {
data: T;
fields: ViewFieldConfig[];
className?: string;
}
// ViewForm component - displays data in read-only format
export function ViewForm<T extends Record<string, any>>({
data,
fields,
className = ''
}: ViewFormProps<T>) {
// Format value based on field type
const formatValue = (field: ViewFieldConfig, value: any): string => {
// Use custom formatter if provided
if (field.formatter) {
return field.formatter(value);
}
// Handle null/undefined
if (value === null || value === undefined) {
return 'N/A';
}
// Type-based formatting
if (field.type) {
// Boolean/Checkbox types
if (isCheckboxType(field.type)) {
return value ? 'Yes' : 'No';
}
// Select/Enum types
if (isSelectType(field.type) && field.options) {
const option = field.options.find(opt => String(opt.value) === String(value));
return option ? option.label : String(value);
}
// Multiselect types
if (isMultiselectType(field.type) && field.options) {
const selectedValues = Array.isArray(value) ? value : (value ? [value] : []);
if (selectedValues.length === 0) {
return 'None';
}
return selectedValues.map(v => {
const option = field.options!.find(opt => String(opt.value) === String(v));
return option ? option.label : String(v);
}).join(', ');
}
// Date/Time/Timestamp types
if (isDateTimeType(field.type)) {
try {
const date = value instanceof Date ? value : new Date(value);
if (!isNaN(date.getTime())) {
return date.toLocaleString();
}
} catch {
// Fall through to default
}
}
}
// Default: convert to string
if (Array.isArray(value)) {
return value.length > 0 ? value.join(', ') : 'None';
}
return String(value);
};
// Render field in view-only mode
const renderField = (field: ViewFieldConfig) => {
const value = data[field.key];
const formattedValue = formatValue(field, value);
return (
<div className={styles.fieldGroup} key={field.key}>
<label className={styles.fieldLabel}>{field.label}</label>
<div className={styles.fieldValue}>
{formattedValue}
</div>
</div>
);
};
return (
<div className={`${styles.viewForm} ${className}`}>
{fields.map(field => renderField(field))}
</div>
);
}
export default ViewForm;


@@ -0,0 +1,3 @@
export { ViewForm, default as DefaultViewForm } from './ViewForm';
export type { ViewFormProps, ViewFieldConfig } from './ViewForm';


@@ -107,6 +107,34 @@
text-align: right;
}
.statsContainer {
display: flex;
align-items: center;
gap: 16px;
flex-wrap: wrap;
}
.statItem {
display: flex;
align-items: center;
gap: 4px;
font-size: 12px;
}
.statLabel {
font-weight: 600;
color: var(--color-text-secondary);
text-transform: uppercase;
letter-spacing: 0.5px;
font-size: 11px;
}
.statValue {
font-weight: 600;
color: var(--color-text);
font-family: 'Courier New', monospace;
}
/* Dark theme support */
[data-theme="dark"] .workflowStatusContainer {
background-color: var(--color-surface-dark);
@@ -151,3 +179,11 @@
color: var(--color-text-dark);
}
[data-theme="dark"] .statLabel {
color: var(--color-text-secondary-dark);
}
[data-theme="dark"] .statValue {
color: var(--color-text-dark);
}


@@ -88,43 +88,28 @@ const extractWorkflowStatus = (logs: any[]): { status: WorkflowStatusType; round
};
};
// Helper function to group logs by round and get latest progress
const getLatestRoundProgress = (logs: any[]): { round: number | null; progress: number | undefined } => {
if (!logs || logs.length === 0) {
return { round: null, progress: undefined };
// Helper function to format bytes to KB or MB
const formatBytes = (bytes?: number): string => {
if (bytes === undefined || bytes === null) return '-';
if (bytes === 0) return '0 B';
const kb = bytes / 1024;
if (kb < 1024) {
return `${kb.toFixed(2)} KB`;
}
const mb = kb / 1024;
return `${mb.toFixed(2)} MB`;
};
// Find the latest round
let currentRound = 1;
let latestProgress: number | undefined = undefined;
let latestRound = 1;
// Helper function to format price
const formatPrice = (price?: number): string => {
if (price === undefined || price === null) return '-';
return `$${price.toFixed(2)}`;
};
const sortedLogs = [...logs].sort((a, b) => (a.timestamp || 0) - (b.timestamp || 0));
sortedLogs.forEach((log) => {
const message = (log.message || '').toLowerCase();
// Check if this is a workflow status message that indicates a round change
if (message.includes('workflow started') || message.includes('workflow resumed')) {
const roundMatch = message.match(/\(?round\s+(\d+)\)?/i);
if (roundMatch) {
currentRound = parseInt(roundMatch[1], 10);
latestRound = currentRound;
} else if (message.includes('workflow started')) {
currentRound = 1;
latestRound = 1;
}
}
// Update progress for current round
if (log.progress !== undefined && log.progress !== null) {
if (currentRound === latestRound) {
latestProgress = log.progress;
}
}
});
return { round: latestRound, progress: latestProgress };
// Helper function to format processing time
const formatProcessingTime = (time?: number): string => {
if (time === undefined || time === null) return '-';
return `${time.toFixed(2)}s`;
};
const WorkflowStatus: React.FC<WorkflowStatusProps> = ({
@@ -132,7 +117,8 @@ const WorkflowStatus: React.FC<WorkflowStatusProps> = ({
logs = [],
workflowStatus: workflowStatusFromApi,
currentRound: currentRoundFromApi,
isRunning
isRunning,
latestStats
}) => {
// Use workflow status and round from API response, fallback to extracting from logs
const workflowStatus = useMemo(() => {
@@ -173,21 +159,12 @@
return extractWorkflowStatus(logs);
}, [workflowStatusFromApi, currentRoundFromApi, logs]);
// Get latest round progress
const latestProgress = useMemo(() => getLatestRoundProgress(logs), [logs]);
// Determine if workflow is running (show spinner)
// Show spinner if explicitly running OR if status indicates running state
const showSpinner = isRunning === true || workflowStatus.status === 'started' || workflowStatus.status === 'resumed';
// Calculate progress percentage
const progressValue = latestProgress.progress !== undefined
? Math.min(Math.max(latestProgress.progress, 0), 1)
: undefined;
const progressPercent = progressValue !== undefined ? Math.round(progressValue * 100) : undefined;
// Don't render if no status information (but always show if spinner should be visible)
if (!showSpinner && !workflowStatus.status && workflowStatus.round === null && progressValue === undefined) {
// Don't render if no status information and no stats (but always show if spinner should be visible)
if (!showSpinner && !workflowStatus.status && workflowStatus.round === null && !latestStats) {
return null;
}
@@ -208,16 +185,33 @@
)}
</div>
{/* Progress Bar */}
{progressValue !== undefined && (
<div className={styles.progressBarContainer}>
<div className={styles.progressBar}>
<div
className={styles.progressBarFill}
style={{ width: `${progressPercent}%` }}
/>
</div>
<div className={styles.progressBarLabel}>{progressPercent}%</div>
{/* Stats Display */}
{latestStats && (
<div className={styles.statsContainer}>
{latestStats.priceUsd !== undefined && (
<div className={styles.statItem}>
<span className={styles.statLabel}>Price:</span>
<span className={styles.statValue}>{formatPrice(latestStats.priceUsd)}</span>
</div>
)}
{latestStats.processingTime !== undefined && (
<div className={styles.statItem}>
<span className={styles.statLabel}>Time:</span>
<span className={styles.statValue}>{formatProcessingTime(latestStats.processingTime)}</span>
</div>
)}
{latestStats.bytesSent !== undefined && (
<div className={styles.statItem}>
<span className={styles.statLabel}>Sent:</span>
<span className={styles.statValue}>{formatBytes(latestStats.bytesSent)}</span>
</div>
)}
{latestStats.bytesReceived !== undefined && (
<div className={styles.statItem}>
<span className={styles.statLabel}>Received:</span>
<span className={styles.statValue}>{formatBytes(latestStats.bytesReceived)}</span>
</div>
)}
</div>
)}
</div>


@@ -1,5 +1,3 @@
import type React from 'react';
/**
* Log entry from workflow
*/
@@ -44,6 +42,16 @@
* Whether the workflow is currently running (shows spinner)
*/
isRunning?: boolean;
/**
* Latest statistics from the workflow (price, processing time, bytes sent/received)
*/
latestStats?: {
priceUsd?: number;
processingTime?: number;
bytesSent?: number;
bytesReceived?: number;
} | null;
}
export type WorkflowStatusType = 'started' | 'resumed' | 'stopped' | 'failed' | 'completed' | null;


@@ -11,7 +11,9 @@ export * from './MapView';
export * from './ParcelInfoPanel';
export * from './CopyableTruncatedValue';
export { Log } from './Log';
export * from './Log';
export type { LogProps } from './Log/LogTypes';
export { LogMessage } from './Log/LogMessage';
export type { LogMessageProps } from './Log/LogMessage';
export { WorkflowStatus } from './WorkflowStatus';
export * from './WorkflowStatus';
export type { WorkflowStatusProps } from './WorkflowStatus/WorkflowStatusTypes';
export * from './AutoScroll';


@@ -1,4 +1,4 @@
import React, { createContext, useContext, useState, useCallback, useEffect } from 'react';
import React, { createContext, useContext, useCallback } from 'react';
import { useUserFiles, useFileOperations, UserFile } from '../hooks/useFiles';
interface FileContextType {
@@ -16,7 +16,7 @@
const FileContext = createContext<FileContextType | undefined>(undefined);
export function FileProvider({ children }: { children: React.ReactNode }) {
const { data: files, loading, error, refetch: refetchFiles, removeFileOptimistically, addFileOptimistically } = useUserFiles();
const { data: files, loading, error, refetch: refetchFiles, removeFileOptimistically } = useUserFiles();
const {
handleFileUpload: hookHandleFileUpload,
handleFileDelete: hookHandleFileDelete,
@@ -40,25 +40,13 @@ export function FileProvider({ children }: { children: React.ReactNode }) {
return result;
}
// Add file optimistically to the shared state
const newFile: UserFile = {
id: fileData.id,
file_name: fileData.fileName || file.name,
mime_type: fileData.mimeType || file.type || 'application/octet-stream',
action: 'Document', // Will be determined by mime type in useUserFiles
created_at: fileData.creationDate ? new Date(fileData.creationDate * 1000).toISOString() : new Date().toISOString(),
size: fileData.fileSize || file.size,
source: 'user_uploaded'
};
addFileOptimistically(newFile);
// File will be added via refetch
// Refetch to ensure we have the latest data (this will update all consumers)
await refetchFiles();
}
return result;
}, [hookHandleFileUpload, addFileOptimistically, refetchFiles]);
}, [hookHandleFileUpload, refetchFiles]);
// Centralized file delete that updates the shared state
const handleFileDelete = useCallback(async (fileId: string, onOptimisticDelete?: () => void) => {


@@ -1,4 +1,4 @@
import React, { createContext, useContext, useState, useCallback, ReactNode } from 'react';
import { createContext, useContext, useState, useCallback, ReactNode } from 'react';
interface WorkflowSelectionContextType {
selectedWorkflowId: string | null;


@@ -26,30 +26,72 @@ const PageManager: React.FC<PageManagerProps> = ({
const currentPath = getCurrentPath();
// Check if user has access to a page using RBAC
// Check if user has access to a page using backend RBAC permissions
const checkPageAccess = async (pageData: GenericPageData): Promise<boolean> => {
console.log('🔍 PageManager: Checking page access:', {
path: pageData.path,
name: pageData.name,
hide: pageData.hide,
moduleEnabled: pageData.moduleEnabled
});
try {
return await canView('UI', pageData.path);
const hasAccess = await canView('UI', pageData.path);
console.log('🔍 PageManager: Page access result:', {
path: pageData.path,
hasAccess
});
return hasAccess;
} catch (error) {
console.error(`Error checking RBAC access for ${pageData.path}:`, error);
console.error(`❌ PageManager: Error checking RBAC access for ${pageData.path}:`, error);
return false;
}
};
useEffect(() => {
console.log('🔄 PageManager: useEffect triggered for path:', currentPath);
const pageData = getPageDataByPath(currentPath);
console.log('📄 PageManager: Page data found:', {
path: currentPath,
hasPageData: !!pageData,
hide: pageData?.hide,
moduleEnabled: pageData?.moduleEnabled,
name: pageData?.name
});
if (!pageData || pageData.hide || !pageData.moduleEnabled) {
console.log('⛔ PageManager: Page not rendered:', {
path: currentPath,
reason: !pageData ? 'not found' : pageData.hide ? 'hidden' : 'module disabled'
});
return;
}
// Check page access
console.log('🔍 PageManager: Checking access before rendering:', currentPath);
checkPageAccess(pageData).then(hasAccess => {
console.log('🔍 PageManager: Access check complete:', {
path: currentPath,
hasAccess
});
if (!hasAccess) {
console.log('⛔ PageManager: Page not rendered due to access check:', currentPath);
return;
}
console.log('✅ PageManager: Rendering page:', {
path: currentPath,
name: pageData.name
});
setPageInstances(prev => {
console.log('📦 PageManager: Creating/updating page instance:', {
path: currentPath,
existingInstances: Array.from(prev.keys()),
willCreateNew: !prev.has(currentPath)
});
const newInstances = new Map(prev);
// Update active states
@@ -59,6 +101,10 @@ const PageManager: React.FC<PageManagerProps> = ({
// Create instance if it doesn't exist
if (!newInstances.has(currentPath)) {
console.log('📦 PageManager: Creating new page instance:', {
path: currentPath,
name: pageData.name
});
const shouldPreserve = pageData.preserveState || false;
const pageInstance: PageInstance = {
@@ -71,7 +117,7 @@ const PageManager: React.FC<PageManagerProps> = ({
) : (
<PageRenderer
pageData={pageData}
onButtonClick={(buttonId, button) => {
onButtonClick={(_buttonId, _button) => {
}}
/>
)}
@@ -84,11 +130,16 @@ const PageManager: React.FC<PageManagerProps> = ({
};
newInstances.set(currentPath, pageInstance);
console.log('✅ PageManager: Page instance created:', {
path: currentPath,
totalInstances: newInstances.size,
allPaths: Array.from(newInstances.keys())
});
} else {
console.log('🔄 PageManager: Page instance already exists, updating active state:', currentPath);
if (import.meta.env.DEV) {
const instance = newInstances.get(currentPath);
const _instance = newInstances.get(currentPath);
void _instance; // Intentionally unused; kept for debugging purposes
}
}

View file

@@ -10,6 +10,7 @@ import { DragDropOverlay } from '../../components/UiComponents/DragDropOverlay';
import { useLanguage } from '../../providers/language/LanguageContext';
import { usePermissions } from '../../hooks/usePermissions';
import { FiPaperclip } from 'react-icons/fi';
import type { WorkflowFile } from '../../hooks/playground/useDashboardInputForm';
import styles from '../../styles/pages.module.css';
interface PageRendererProps {
@@ -362,7 +363,7 @@ const PageRenderer: React.FC<PageRendererProps> = ({
onSave?: (sectionId: string, data: any) => Promise<void>;
getNestedValue: (obj: any, path: string) => any;
setNestedValue: (obj: any, path: string, value: any) => any;
}> = ({ sections, formData, fieldsBySection, loadingBySection, errorsBySection, onSave, getNestedValue, setNestedValue }) => {
}> = ({ sections, formData, fieldsBySection, loadingBySection, errorsBySection, onSave, getNestedValue, setNestedValue: _setNestedValue }) => {
const [sectionFormData, setSectionFormData] = useState<Record<string, any>>({});
const [sectionSaveLoading, setSectionSaveLoading] = useState<Record<string, boolean>>({});
const [sectionSaveMessages, setSectionSaveMessages] = useState<Record<string, { type: 'success' | 'error', text: string } | null>>({});
@@ -677,6 +678,46 @@ const PageRenderer: React.FC<PageRendererProps> = ({
// Render content based on type
const renderContent = (content: PageContent) => {
// Wrapper functions to convert fileId-based handlers to WorkflowFile-based handlers
// These are defined at the top level of renderContent so they're accessible in all content cases
const wrapFileDelete: ((file: WorkflowFile) => Promise<void>) | undefined = hookData?.handleFileDelete ? async (file: WorkflowFile) => {
if (!hookData?.handleFileDelete || !file) return;
const handler = hookData.handleFileDelete as any;
// Check if handler expects fileId (string) or file (WorkflowFile)
if (file?.fileId && typeof file.fileId === 'string') {
// Try fileId signature first (handler might be (fileId: string, ...) => Promise<boolean>)
try {
const result = handler(file.fileId);
if (result instanceof Promise) await result;
return;
} catch {
// Fall through to file signature
}
}
// Try file signature (handler might be (file: WorkflowFile) => Promise<void>)
const result = handler(file);
if (result instanceof Promise) await result;
} : undefined;
const wrapFileRemove: ((file: WorkflowFile) => Promise<void>) | undefined = hookData?.handleFileRemove ? async (file: WorkflowFile) => {
if (!hookData?.handleFileRemove || !file) return;
const handler = hookData.handleFileRemove as any;
// Check if handler expects fileId (string) or file (WorkflowFile)
if (file?.fileId && typeof file.fileId === 'string') {
// Try fileId signature first (handler might be (fileId: string) => void | Promise<void>)
try {
const result = handler(file.fileId);
if (result instanceof Promise) await result;
return;
} catch {
// Fall through to file signature
}
}
// Try file signature (handler might be (file: WorkflowFile) => Promise<void>)
const result = handler(file);
if (result instanceof Promise) await result;
} : undefined;
switch (content.type) {
case 'heading':
const HeadingTag = `h${content.level || 2}` as keyof React.JSX.IntrinsicElements;
@@ -834,7 +875,14 @@ const PageRenderer: React.FC<PageRendererProps> = ({
}
} else {
// Non-function disabled value
disabledFn = () => action.disabled as boolean | { disabled: boolean; message?: string };
const disabledValue = action.disabled;
if (typeof disabledValue === 'boolean') {
disabledFn = () => disabledValue;
} else if (disabledValue && typeof disabledValue === 'object' && 'disabled' in disabledValue) {
disabledFn = () => disabledValue as { disabled: boolean; message?: string };
} else {
disabledFn = () => false;
}
}
} else {
disabledFn = () => false;
@@ -949,7 +997,7 @@ const PageRenderer: React.FC<PageRendererProps> = ({
<DropdownSelect
items={hookData.promptItems || []}
selectedItemId={hookData.selectedPromptId || null}
onSelect={hookData.onPromptSelect}
onSelect={hookData.onPromptSelect || (() => {})}
placeholder={t('dashboard.prompt.select', 'Select a prompt')}
emptyMessage={t('dashboard.prompt.empty', 'No prompts available')}
headerText={t('dashboard.prompt.header', 'Select Prompt')}
@@ -966,7 +1014,7 @@ const PageRenderer: React.FC<PageRendererProps> = ({
<DropdownSelect
items={hookData.workflowModeItems || []}
selectedItemId={hookData.selectedWorkflowMode || null}
onSelect={hookData.onWorkflowModeSelect}
onSelect={hookData.onWorkflowModeSelect || (() => {})}
placeholder={t('dashboard.workflow.mode.select', 'Select workflow mode')}
emptyMessage={t('dashboard.workflow.mode.empty', 'No modes available')}
headerText={t('dashboard.workflow.mode.header', 'Workflow Mode')}
@@ -1033,7 +1081,7 @@ const PageRenderer: React.FC<PageRendererProps> = ({
},
{
type: 'remove',
onAction: hookData.handleFileRemove,
onAction: wrapFileRemove,
showOnlyForPending: true,
idField: 'fileId',
loadingStateName: 'removingItems'
@@ -1045,9 +1093,12 @@ const PageRenderer: React.FC<PageRendererProps> = ({
idField: 'fileId'
}
]}
onDelete={hookData.handleFileDelete}
onRemove={hookData.handleFileRemove}
onAttach={hookData.handleFileAttach} // Allow attaching files for next message
onDelete={wrapFileDelete}
onRemove={wrapFileRemove}
onAttach={hookData.handleFileAttach ? async (fileId: string) => {
const result = hookData.handleFileAttach!(fileId);
if (result instanceof Promise) await result;
} : undefined}
deletingFiles={hookData.deletingFiles || new Set()}
previewingFiles={hookData.previewingFiles || new Set()}
removingFiles={new Set()} // Can be tracked if needed
@@ -1080,7 +1131,13 @@ const PageRenderer: React.FC<PageRendererProps> = ({
justifyContent: 'flex-end'
}}>
<UploadButton
onUpload={hookData.handleFileUploadAndAttach || hookData.handleFileUpload}
onUpload={hookData.handleFileUploadAndAttach || hookData.handleFileUpload ? async (file: File) => {
const handler = hookData.handleFileUploadAndAttach || hookData.handleFileUpload;
if (handler) {
// Handler returns Promise<{ success, data }>, but UploadButton expects Promise<void>
await handler(file);
}
} : async () => {}}
disabled={hookData.isSubmitting || false}
loading={hookData.uploadingFile || false}
variant="primary"
@@ -1207,7 +1264,7 @@ const PageRenderer: React.FC<PageRendererProps> = ({
<DropdownSelect
items={hookData.promptItems || []}
selectedItemId={hookData.selectedPromptId || null}
onSelect={hookData.onPromptSelect}
onSelect={hookData.onPromptSelect || (() => {})}
placeholder={t('dashboard.prompt.select', 'Select a prompt')}
emptyMessage={t('dashboard.prompt.empty', 'No prompts available')}
headerText={t('dashboard.prompt.header', 'Select Prompt')}
@@ -1222,7 +1279,7 @@ const PageRenderer: React.FC<PageRendererProps> = ({
<DropdownSelect
items={hookData.workflowModeItems || []}
selectedItemId={hookData.selectedWorkflowMode || null}
onSelect={hookData.onWorkflowModeSelect}
onSelect={hookData.onWorkflowModeSelect || (() => {})}
placeholder={t('dashboard.workflow.mode.select', 'Select workflow mode')}
emptyMessage={t('dashboard.workflow.mode.empty', 'No modes available')}
headerText={t('dashboard.workflow.mode.header', 'Workflow Mode')}
@@ -1320,8 +1377,8 @@ const PageRenderer: React.FC<PageRendererProps> = ({
showDocuments={config.showDocuments !== false}
showMetadata={config.showMetadata !== false}
showProgress={config.showProgress !== false}
onFileDelete={hookData?.handleFileDelete}
onFileRemove={hookData?.handleFileRemove}
onFileDelete={wrapFileDelete}
onFileRemove={wrapFileRemove}
deletingFiles={hookData?.deletingFiles}
previewingFiles={hookData?.previewingFiles}
removingFiles={hookData?.removingFiles}
@@ -1334,8 +1391,8 @@ const PageRenderer: React.FC<PageRendererProps> = ({
key={message.id || index}
message={cleanMessage}
showDocuments={config.showDocuments !== false}
onFileDelete={hookData?.handleFileDelete}
onFileRemove={hookData?.handleFileRemove}
onFileDelete={wrapFileDelete}
onFileRemove={wrapFileRemove}
deletingFiles={hookData?.deletingFiles}
previewingFiles={hookData?.previewingFiles}
removingFiles={hookData?.removingFiles}
@@ -1356,8 +1413,8 @@ const PageRenderer: React.FC<PageRendererProps> = ({
showMetadata={config.showMetadata !== false}
showProgress={config.showProgress !== false}
emptyMessage={config.emptyMessage ? resolveLanguageText(config.emptyMessage, t) : undefined}
onFileDelete={hookData?.handleFileDelete}
onFileRemove={hookData?.handleFileRemove}
onFileDelete={wrapFileDelete}
onFileRemove={wrapFileRemove}
deletingFiles={hookData?.deletingFiles}
previewingFiles={hookData?.previewingFiles}
removingFiles={hookData?.removingFiles}
@@ -1393,12 +1450,16 @@ const PageRenderer: React.FC<PageRendererProps> = ({
case 'log': {
const logConfig = content.logConfig || {};
const logEntries = Array.isArray(hookData?.logs) ? hookData.logs : [];
const dashboardTree = hookData?.dashboardTree;
const onToggleOperationExpanded = hookData?.onToggleOperationExpanded;
const getChildOperations = hookData?.getChildOperations;
return (
<div key={content.id} className={styles.logSection}>
<Log
emptyMessage={logConfig.emptyMessage ? resolveLanguageText(logConfig.emptyMessage, t) : undefined}
logs={logEntries}
dashboardTree={dashboardTree}
onToggleOperationExpanded={onToggleOperationExpanded}
getChildOperations={getChildOperations}
/>
</div>
);
@@ -1467,6 +1528,7 @@ const PageRenderer: React.FC<PageRendererProps> = ({
workflowStatus={hookData?.workflowStatus}
currentRound={hookData?.currentRound || hookData?.workflowData?.currentRound}
isRunning={hookData?.isRunning || false}
latestStats={hookData?.latestStats || null}
/>
</div>
)}
@@ -1657,6 +1719,14 @@ const PageRenderer: React.FC<PageRendererProps> = ({
return await createOperation(formData);
};
// Evaluate disabled property if it's a function
const isDisabled = typeof button.disabled === 'function'
? button.disabled(hookData)
: button.disabled ?? false;
const disabledValue = typeof isDisabled === 'object' && isDisabled !== null && 'disabled' in isDisabled
? isDisabled.disabled
: Boolean(isDisabled);
return (
<CreateButton
key={button.id}
@@ -1667,7 +1737,7 @@ const PageRenderer: React.FC<PageRendererProps> = ({
variant={button.variant || 'primary'}
size={button.size || 'md'}
icon={button.icon}
disabled={button.disabled}
disabled={disabledValue}
onSuccess={() => {
// Refetch data after successful creation
if (hookData.refetch) {

View file

@@ -3,6 +3,7 @@ import { allPageData, SidebarItem } from './data';
import { useLanguage } from '../../providers/language/LanguageContext';
import { resolveLanguageText } from './pageInterface';
import { usePermissions } from '../../hooks/usePermissions';
import { getUserDataCache } from '../../utils/userCache';
import { FaHome, FaCogs } from 'react-icons/fa';
// Configuration for parent groups that don't have a page definition
@@ -121,7 +122,7 @@ export const SidebarProvider: React.FC<SidebarProviderProps> = ({ children }) =>
}
// Process parent groups
for (const [parentPath, parentGroup] of parentGroups.entries()) {
for (const [_parentPath, parentGroup] of parentGroups.entries()) {
// Filter subpages by RBAC access
const accessibleSubpages = [];
for (const subpage of parentGroup.subpages) {
@@ -159,16 +160,39 @@ export const SidebarProvider: React.FC<SidebarProviderProps> = ({ children }) =>
.filter(page => !page.parentPath && !page.hide && page.showInSidebar !== false)
.sort((a, b) => (a.order || 0) - (b.order || 0));
// Log user info for debugging
const cachedUser = getUserDataCache();
console.log('👤 SidebarProvider: Current user info:', {
username: cachedUser?.username,
roleLabels: cachedUser?.roleLabels,
roleLabelsLength: Array.isArray(cachedUser?.roleLabels) ? cachedUser.roleLabels.length : 0,
privilege: cachedUser?.privilege
});
// Process each main page
console.log('📋 SidebarProvider: Processing pages, total:', mainPages.length, 'pages to check');
const pageAccessResults: Array<{ path: string; name: string; hasAccess: boolean }> = [];
for (const pageData of mainPages) {
console.log('🔍 SidebarProvider: Checking access for page:', {
path: pageData.path,
name: pageData.name,
hasSubpages: pageData.hasSubpages
});
// Check RBAC permissions
try {
const hasRBACAccess = await canView('UI', pageData.path);
console.log('🔍 SidebarProvider: RBAC check result:', {
path: pageData.path,
hasAccess: hasRBACAccess
});
if (!hasRBACAccess) {
console.log('⛔ SidebarProvider: Page hidden due to RBAC:', pageData.path);
continue;
}
} catch (error) {
console.error(`Error checking RBAC access for ${pageData.path}:`, error);
console.error(`❌ SidebarProvider: Error checking RBAC access for ${pageData.path}:`, error);
continue;
}
@@ -183,18 +207,49 @@ export const SidebarProvider: React.FC<SidebarProviderProps> = ({ children }) =>
// Filter subpages by RBAC access
const accessibleSubpages = [];
console.log('📋 SidebarProvider: Checking subpages for:', {
parentPath: pageData.path,
totalSubpages: allSubpages.length
});
for (const subpage of allSubpages) {
try {
console.log('🔍 SidebarProvider: Checking subpage access:', {
parentPath: pageData.path,
subpagePath: subpage.path,
subpageName: subpage.name
});
const hasSubpageRBACAccess = await canView('UI', subpage.path);
console.log('🔍 SidebarProvider: Subpage RBAC result:', {
subpagePath: subpage.path,
hasAccess: hasSubpageRBACAccess
});
if (hasSubpageRBACAccess) {
accessibleSubpages.push(subpage);
console.log('✅ SidebarProvider: Subpage added:', subpage.path);
} else {
console.log('⛔ SidebarProvider: Subpage hidden due to RBAC:', subpage.path);
}
} catch (error) {
console.error(`Error checking RBAC access for subpage ${subpage.path}:`, error);
console.error(`❌ SidebarProvider: Error checking RBAC access for subpage ${subpage.path}:`, error);
}
}
console.log('📋 SidebarProvider: Subpage filtering complete:', {
parentPath: pageData.path,
totalSubpages: allSubpages.length,
accessibleSubpages: accessibleSubpages.length,
accessiblePaths: accessibleSubpages.map(s => s.path)
});
if (accessibleSubpages.length > 0) {
console.log('✅ SidebarProvider: Adding parent page with subpages:', {
path: pageData.path,
name: pageData.name,
subpagesCount: accessibleSubpages.length
});
// Create expandable item with submenu
items.push({
id: pageData.id,
@@ -212,6 +267,10 @@ export const SidebarProvider: React.FC<SidebarProviderProps> = ({ children }) =>
});
} else {
// No accessible subpages, show as regular item
console.log('✅ SidebarProvider: Adding parent page without accessible subpages:', {
path: pageData.path,
name: pageData.name
});
items.push({
id: pageData.id,
name: resolveLanguageText(pageData.name, t),
@@ -223,6 +282,10 @@ export const SidebarProvider: React.FC<SidebarProviderProps> = ({ children }) =>
}
} else {
// Regular items without subpages
console.log('✅ SidebarProvider: Adding regular page:', {
path: pageData.path,
name: pageData.name
});
items.push({
id: pageData.id,
name: resolveLanguageText(pageData.name, t),
@@ -235,19 +298,50 @@ export const SidebarProvider: React.FC<SidebarProviderProps> = ({ children }) =>
}
// Sort all items by order
return items.sort((a, b) => (a.order || 0) - (b.order || 0));
const sortedItems = items.sort((a, b) => (a.order || 0) - (b.order || 0));
// Summary of page access checks
const accessiblePages = pageAccessResults.filter(r => r.hasAccess);
const deniedPages = pageAccessResults.filter(r => !r.hasAccess);
console.log('📊 SidebarProvider: Page access summary:', {
totalPagesChecked: pageAccessResults.length,
accessiblePages: accessiblePages.length,
deniedPages: deniedPages.length,
accessiblePagePaths: accessiblePages.map(p => p.path),
deniedPagePaths: deniedPages.map(p => p.path),
deniedPageDetails: deniedPages.map(p => ({ path: p.path, name: p.name }))
});
console.log('📊 SidebarProvider: Final sidebar items built and sorted:', {
totalItems: sortedItems.length,
sortedPaths: sortedItems.map(item => item.link),
items: sortedItems.map(item => ({
id: item.id,
link: item.link,
name: item.name,
hasSubmenu: !!item.submenu,
submenuCount: item.submenu?.length || 0
}))
});
return sortedItems;
};
// Refresh sidebar items
const refreshSidebar = async () => {
console.log('🔄 SidebarProvider: Refreshing sidebar items...');
setLoading(true);
setError(null);
try {
const items = await getSidebarItems();
console.log('✅ SidebarProvider: Setting sidebar items:', {
count: items.length,
items: items.map(item => ({ id: item.id, link: item.link, name: item.name }))
});
setSidebarItems(items);
} catch (err) {
console.error('Error refreshing sidebar:', err);
console.error('❌ SidebarProvider: Error refreshing sidebar:', err);
setError(err instanceof Error ? err.message : 'Failed to load sidebar items');
} finally {
setLoading(false);

View file

@@ -1,7 +1,6 @@
import { useCallback } from 'react';
import { GenericPageData } from '../../pageInterface';
import { FaGoogle, FaMicrosoft, FaLink } from 'react-icons/fa';
import { privilegeCheckers } from '../../../../utils/privilegeCheckers';
import { useConnections } from '../../../../hooks/useConnections';
// Helper function to convert attribute definitions to column config
@@ -233,9 +232,6 @@ export const connectionsPageData: GenericPageData = {
}
],
// Privilege system
privilegeChecker: privilegeCheckers.viewerRole,
// Page behavior
persistent: false,
preload: false,

View file

@@ -3,7 +3,6 @@ import { LuTicket } from 'react-icons/lu';
import { IoMdSend } from 'react-icons/io';
import { MdStop } from 'react-icons/md';
import { HiOutlineCollection } from 'react-icons/hi';
import { privilegeCheckers } from '../../../../utils/privilegeCheckers';
import { createDashboardHook } from '../../../../hooks/usePlayground';
export const dashboardPageData: GenericPageData = {
@@ -35,6 +34,7 @@ export const dashboardPageData: GenericPageData = {
placeholder: 'dashboard.workflow.select',
emptyMessage: 'dashboard.workflow.empty',
headerText: 'dashboard.workflow.header',
onSelect: () => {}, // Placeholder - actual handler comes from dataSource.onSelectMethod
dataSource: {
itemsProperty: 'workflowItems',
selectedIdProperty: 'selectedWorkflowId',
@@ -81,9 +81,6 @@ export const dashboardPageData: GenericPageData = {
}
],
// Privilege system
privilegeChecker: privilegeCheckers.viewerRole,
// Page behavior
persistent: true,
preserveState: true,

View file

@@ -1,7 +1,6 @@
import { useCallback } from 'react';
import { GenericPageData } from '../../pageInterface';
import { FaRegFileAlt, FaUpload } from 'react-icons/fa';
import { privilegeCheckers } from '../../../../utils/privilegeCheckers';
import { useUserFiles, useFileOperations } from '../../../../hooks/useFiles';
// Helper function to convert attribute definitions to column config
@@ -272,9 +271,6 @@ export const filesPageData: GenericPageData = {
}
],
// Privilege system
privilegeChecker: privilegeCheckers.viewerRole,
// Page behavior
persistent: false,
preload: false,

View file

@@ -1,7 +1,6 @@
import { GenericPageData } from '../../pageInterface';
import { FaTable, FaBuilding } from 'react-icons/fa';
import { FaTable } from 'react-icons/fa';
import { IoMdSend } from 'react-icons/io';
import { privilegeCheckers } from '../../../../utils/privilegeCheckers';
import { usePekTablesContext } from '../../../../contexts/PekTablesContext';
import PekTablesDropdown from './pek-tables/PekTablesDropdown';
import PekTablesPageWrapper from './pek-tables/PekTablesPageWrapper';
@@ -104,9 +103,6 @@ export const pekTablesPageData: GenericPageData = {
}
],
// Privilege system
privilegeChecker: privilegeCheckers.viewerRole,
// Page behavior
persistent: false,
preload: false,

View file

@@ -1,7 +1,6 @@
import { GenericPageData } from '../../pageInterface';
import { FaBuilding } from 'react-icons/fa';
import { IoMdSend } from 'react-icons/io';
import { privilegeCheckers } from '../../../../utils/privilegeCheckers';
import PekLocationInput from './pek/PekLocationInput';
import PekMapView from './pek/PekMapView';
import { usePek } from '../../../../hooks/usePek';
@@ -93,9 +92,6 @@ export const pekPageData: GenericPageData = {
}
],
// Privilege system
privilegeChecker: privilegeCheckers.viewerRole,
// Page behavior
persistent: false,
preload: false,

View file

@@ -6,7 +6,7 @@
.fieldsRow {
display: flex;
gap: 1rem;
align-items: flex-start;
align-items: flex-end;
}
.fieldWrapper {
@@ -15,9 +15,8 @@
.buttonsWrapper {
display: flex;
flex-direction: column;
flex-direction: row;
gap: 0.5rem;
margin-top: 1.5rem;
min-width: 150px;
}
@@ -35,9 +34,7 @@
}
.buttonsWrapper {
flex-direction: row;
width: 100%;
margin-top: 0.5rem;
}
.fieldWrapper {
@@ -57,7 +54,6 @@
.buttonsWrapper {
width: 100%;
margin-top: 0.5rem;
}
.searchButton,

View file

@@ -7,16 +7,16 @@ import styles from './PekLocationInput.module.css';
const PekLocationInput: React.FC = () => {
const {
kanton,
setKanton,
gemeinde,
setGemeinde,
kanton: _kanton,
setKanton: _setKanton,
gemeinde: _gemeinde,
setGemeinde: _setGemeinde,
adresse,
setAdresse,
buildLocationString,
useCurrentLocation,
isGettingLocation,
locationError,
locationError: _locationError,
searchParcel,
isSearchingParcel
} = usePekContext();
@@ -36,45 +36,6 @@ const PekLocationInput: React.FC = () => {
return (
<div className={styles.locationInputContainer}>
<div className={styles.fieldsRow}>
<div className={styles.fieldWrapper}>
<TextField
value={kanton}
onChange={setKanton}
placeholder="z.B. BE"
label="Kanton"
error={locationError && !gemeinde && !adresse ? locationError : undefined}
disabled={isGettingLocation || isSearchingParcel}
size="md"
type="text"
name="kanton"
onKeyDown={(e) => {
if (e.key === 'Enter') {
e.preventDefault();
const gemeindeInput = document.querySelector('input[name="gemeinde"]') as HTMLInputElement;
if (gemeindeInput) gemeindeInput.focus();
}
}}
/>
</div>
<div className={styles.fieldWrapper}>
<TextField
value={gemeinde}
onChange={setGemeinde}
placeholder="z.B. Bern"
label="Gemeinde"
disabled={isGettingLocation || isSearchingParcel}
size="md"
type="text"
name="gemeinde"
onKeyDown={(e) => {
if (e.key === 'Enter') {
e.preventDefault();
const adresseInput = document.querySelector('input[name="adresse"]') as HTMLInputElement;
if (adresseInput) adresseInput.focus();
}
}}
/>
</div>
<div className={styles.fieldWrapper}>
<TextField
value={adresse}

View file

@@ -1,7 +1,6 @@
import { useCallback } from 'react';
import { GenericPageData } from '../../pageInterface';
import { FaLightbulb, FaPlus } from 'react-icons/fa';
import { privilegeCheckers } from '../../../../utils/privilegeCheckers';
import { usePrompts, usePromptOperations } from '../../../../hooks/usePrompts';
// Helper function to convert attribute definitions to column config
@@ -267,9 +266,6 @@ export const promptsPageData: GenericPageData = {
}
],
// Privilege system
privilegeChecker: privilegeCheckers.viewerRole,
// Page behavior
persistent: false,
preload: false,

View file

@@ -1,6 +1,5 @@
import { GenericPageData } from '../../pageInterface';
import { FaCog } from 'react-icons/fa';
import { privilegeCheckers } from '../../../../utils/privilegeCheckers';
import { createSettingsHook } from '../../../../hooks/useSettings';
export const settingsPageData: GenericPageData = {
@@ -14,9 +13,6 @@ export const settingsPageData: GenericPageData = {
title: 'settings.title',
subtitle: 'Manage your account settings and preferences',
// Privilege system
privilegeChecker: privilegeCheckers.viewerRole,
// Page behavior
persistent: false,
preserveState: false,

View file

@@ -1,7 +1,6 @@
import { GenericPageData } from '../../pageInterface';
import { FaDownload, FaTrash, FaSearch } from 'react-icons/fa';
import { IoIosDocument } from 'react-icons/io';
import { privilegeCheckers } from '../../../../utils/privilegeCheckers';
export const speechTranscriptsPageData: GenericPageData = {
id: '8-1',
@@ -99,9 +98,6 @@ export const speechTranscriptsPageData: GenericPageData = {
}
],
// Privilege system
privilegeChecker: privilegeCheckers.speechSignup,
// Page behavior
persistent: false,
preload: false,

View file

@@ -1,6 +1,5 @@
import { GenericPageData } from '../../pageInterface';
import { FaRegFileAlt, FaMicrophone, FaCog, FaHistory } from 'react-icons/fa';
import { privilegeCheckers } from '../../../../utils/privilegeCheckers';
export const speechPageData: GenericPageData = {
id: 'start-speech',
@@ -50,8 +49,7 @@ export const speechPageData: GenericPageData = {
onClick: () => {
console.log('Opening transcript history...');
// Navigate to transcripts
},
privilegeChecker: privilegeCheckers.speechSignup
}
}
],
@@ -111,12 +109,8 @@
}
],
// Privilege system
privilegeChecker: privilegeCheckers.viewerRole,
// Subpage support
hasSubpages: true,
subpagePrivilegeChecker: privilegeCheckers.speechSignup,
// Page behavior
persistent: false,

View file

@@ -1,7 +1,6 @@
import { useCallback } from 'react';
import { GenericPageData } from '../../pageInterface';
import { FaUsers, FaPlus } from 'react-icons/fa';
import { privilegeCheckers } from '../../../../utils/privilegeCheckers';
import { useOrgUsers, useUserOperations } from '../../../../hooks/useUsers';
// Helper function to convert attribute definitions to column config
@@ -268,9 +267,6 @@ export const teamMembersPageData: GenericPageData = {
}
],
// Privilege system - only admin and sysadmin can access
privilegeChecker: privilegeCheckers.adminRole,
// Page behavior
persistent: false,
preload: false,

View file

@@ -1,7 +1,6 @@
import { useCallback } from 'react';
import { GenericPageData } from '../../pageInterface';
import { FaProjectDiagram } from 'react-icons/fa';
import { privilegeCheckers } from '../../../../utils/privilegeCheckers';
import { useUserWorkflows, useWorkflowOperations } from '../../../../hooks/useWorkflows';
// Helper function to convert attribute definitions to column config
@@ -174,6 +173,7 @@ export const workflowsPageData: GenericPageData = {
idField: 'id',
nameField: 'name',
navigateTo: 'start/dashboard',
mode: 'workflow', // Set mode to 'workflow' to select workflow instead of setting prompt
// Only show if user has read permission (permissions.read !== 'n')
disabled: (hookData: any) => {
if (!hookData?.permissions) return { disabled: false };
@@ -220,9 +220,6 @@
}
],
// Privilege system
privilegeChecker: privilegeCheckers.viewerRole,
// Page behavior
persistent: false,
preload: false,

View file

@@ -9,13 +9,14 @@ export type PrivilegeChecker = () => boolean | Promise<boolean>;
export interface ButtonFormField {
key: string;
label: string | LanguageText;
type: 'string' | 'boolean' | 'email' | 'textarea' | 'date' | 'enum' | 'readonly';
type: 'string' | 'boolean' | 'email' | 'textarea' | 'date' | 'enum' | 'readonly' | 'multiselect';
required?: boolean;
placeholder?: string | LanguageText;
minRows?: number;
maxRows?: number;
validator?: (value: any) => string | null;
defaultValue?: any;
options?: string[] | Array<{ value: string | number; label: string }>; // For enum/multiselect fields
}
// Dropdown configuration for header dropdown buttons
@@ -37,6 +38,7 @@ export interface DropdownConfig<T = any> {
itemsProperty?: string; // Property name in hookData that contains items array
selectedIdProperty?: string; // Property name in hookData that contains selectedItemId
onSelectMethod?: string; // Method name in hookData for onSelect callback
loadingProperty?: string; // Property name in hookData that contains loading state
};
}
@@ -49,7 +51,6 @@ export interface PageButton {
icon?: IconType;
onClick?: (hookData?: any) => void | Promise<void>;
disabled?: boolean | ((hookData?: any) => boolean | { disabled: boolean; message?: string });
privilegeChecker?: PrivilegeChecker;
// Form configuration for create buttons
formConfig?: {
fields: ButtonFormField[];
@@ -128,7 +129,6 @@ export interface PageContent {
items?: (string | LanguageText)[]; // For lists
language?: string; // For code blocks
customComponent?: React.ComponentType<any>;
privilegeChecker?: PrivilegeChecker;
// Table-specific properties
tableConfig?: TableContentConfig;
// Input form-specific properties
@@ -161,9 +161,24 @@ export interface GenericDataHook {
columns?: any[]; // Optional columns configuration
// File operations
handleUpload?: (file: File) => Promise<{ success: boolean; data: any }>; // For file upload functionality
handleFileUpload?: (file: File) => Promise<{ success: boolean; data: any }>; // Alias for handleUpload
handleDownload?: (fileId: string, fileName: string) => Promise<boolean>; // For file download functionality
handleDelete?: (fileId: string, onOptimisticDelete?: () => void) => Promise<boolean>; // For file delete functionality
handleFileDelete?: ((fileId: string, onOptimisticDelete?: () => void) => Promise<boolean>) | ((file: any) => Promise<void>); // Can accept fileId or WorkflowFile
handlePreview?: (fileId: string, fileName: string, mimeType?: string) => Promise<any>; // For file preview functionality
// File management properties
workflowFiles?: any[]; // Files connected to workflow
pendingFiles?: any[]; // Files pending attachment
allUserFiles?: any[]; // All user files
handleFileRemove?: ((fileId: string) => Promise<void> | void) | ((file: any) => Promise<void> | void); // Can accept fileId or WorkflowFile
handleFileAttach?: (fileId: string) => Promise<void>; // Attach file to workflow (always returns Promise)
handleFileUploadAndAttach?: (file: File) => Promise<{ success: boolean; data: any }>; // Upload and attach file
uploadingFile?: boolean; // Loading state for file upload
deletingFiles?: Set<string>; // Set of file IDs being deleted
previewingFiles?: Set<string>; // Set of file IDs being previewed
removingFiles?: Set<string>; // Set of file IDs being removed
isFileAttachmentPopupOpen?: boolean; // Whether file attachment popup is open
setIsFileAttachmentPopupOpen?: (open: boolean) => void; // Set file attachment popup state
// FormGenerator specific handlers
onDelete?: (row: any) => Promise<void>; // For single item deletion
onDeleteMultiple?: (rows: any[]) => Promise<void>; // For multiple item deletion
@ -172,14 +187,37 @@ export interface GenericDataHook {
onInputChange?: (value: string) => void;
handleSubmit?: () => Promise<void>; // No parameters, uses internal inputValue
isSubmitting?: boolean;
// Prompt selector properties
promptPermission?: {
view?: boolean;
read?: string;
};
promptItems?: Array<{ id: string | number; label: string; value: any }>;
selectedPromptId?: string | number | null;
onPromptSelect?: (item: { id: string | number; label: string; value: any } | null) => void | Promise<void>;
promptsLoading?: boolean;
// Workflow mode selector properties
workflowModeItems?: Array<{ id: string | number; label: string; value: any }>;
selectedWorkflowMode?: string | number | null;
onWorkflowModeSelect?: (item: { id: string | number; label: string; value: any } | null) => void | Promise<void>;
// Workflow lifecycle state
workflowId?: string;
workflowStatus?: string;
workflowData?: {
currentRound?: number;
[key: string]: any;
};
isRunning?: boolean;
currentRound?: number; // Current workflow round
latestStats?: any; // Latest workflow statistics
// Messages from workflow
messages?: any[];
// Logs from workflow
logs?: any[];
// Dashboard log tree
dashboardTree?: any; // Dashboard log tree structure
onToggleOperationExpanded?: (operationId: string) => void;
getChildOperations?: (parentId: string | null) => string[];
// Message overlay component
MessageOverlayComponent?: () => React.ReactElement;
// Settings-specific properties
@ -188,6 +226,8 @@ export interface GenericDataHook {
settingsLoading?: Record<string, boolean>; // Loading state per section
settingsErrors?: Record<string, string | null>; // Error state per section
saveSection?: (sectionId: string, data: any) => Promise<void>; // Save handler for a section
// Dropdown data source loading property
[key: string]: any; // Allow additional properties for dynamic data sources
}
// Action button configuration
@ -275,9 +315,6 @@ export interface GenericPageData {
// Content sections
content?: PageContent[];
// Privilege system
privilegeChecker?: PrivilegeChecker;
// Page behavior
persistent?: boolean;
preserveState?: boolean;
@ -287,7 +324,6 @@ export interface GenericPageData {
// Subpage support
hasSubpages?: boolean;
subpagePrivilegeChecker?: PrivilegeChecker;
// Lifecycle hooks
onActivate?: () => void | Promise<void>;
@ -312,8 +348,8 @@ export interface PageDataFile {
export interface SidebarItem {
id: string;
name: string;
link: string;
icon?: IconType;
link: string | undefined; // Allow undefined for parent groups that aren't clickable pages
icon?: IconType | React.ComponentType<React.SVGProps<SVGSVGElement>>; // Allow both IconType and SVG components
moduleEnabled: boolean;
order: number;
submenu?: SidebarSubmenuItemData[];


@ -1,4 +1,4 @@
import { useState, useEffect, useCallback, useMemo } from 'react';
import { useState, useEffect, useCallback, useMemo, useRef } from 'react';
import { useApiRequest } from '../useApi';
import { useWorkflowSelection } from '../../contexts/WorkflowSelectionContext';
import { useFileContext } from '../../contexts/FileContext';
@ -9,7 +9,9 @@ import { deleteFileFromMessageApi } from '../../api/workflowApi';
import type { Workflow, WorkflowMessage } from '../../api/workflowApi';
import { useWorkflowLifecycle } from './useWorkflowLifecycle';
import { useWorkflows } from './useWorkflows';
import { useDashboardLogTree } from './useDashboardLogTree';
import { extractFileIdsFromMessage, convertFilesToDocuments, sortMessages } from './playgroundUtils';
import type { WorkflowLog as LogTypesWorkflowLog } from '../../components/UiComponents/Log/LogTypes';
export interface WorkflowFile {
id: string;
@ -44,13 +46,28 @@ export function useDashboardInputForm() {
isStopping,
startingWorkflow,
messages,
logs,
dashboardLogs,
unifiedContentLogs,
latestStats,
startWorkflow,
stopWorkflow,
resetWorkflow,
selectWorkflow,
setWorkflowStatusOptimistic
} = useWorkflowLifecycle();
// Dashboard log tree hook
const {
tree: dashboardTree,
processDashboardLogs,
clearDashboard,
toggleOperationExpanded,
updateCurrentRound,
getChildOperations
} = useDashboardLogTree();
// Ref to prevent infinite sync loops
const isSyncingRef = useRef(false);
const fileContext = useFileContext();
const { request } = useApiRequest();
@ -83,11 +100,38 @@ export function useDashboardInputForm() {
checkPermissions();
}, [canView, checkPermission]);
// Sync context -> lifecycle: When context selection changes, update lifecycle
useEffect(() => {
if (isSyncingRef.current) return;
if (selectedWorkflowId && selectedWorkflowId !== workflowId) {
selectWorkflow(selectedWorkflowId);
isSyncingRef.current = true;
selectWorkflow(selectedWorkflowId).finally(() => {
isSyncingRef.current = false;
});
} else if (!selectedWorkflowId && workflowId) {
// If context is cleared but lifecycle still has a workflow, reset lifecycle
isSyncingRef.current = true;
resetWorkflow();
isSyncingRef.current = false;
}
}, [selectedWorkflowId, workflowId, selectWorkflow]);
}, [selectedWorkflowId, workflowId, selectWorkflow, resetWorkflow]);
// Sync lifecycle -> context: When lifecycle workflowId changes, update context
useEffect(() => {
if (isSyncingRef.current) return;
if (workflowId && workflowId !== selectedWorkflowId) {
isSyncingRef.current = true;
selectWorkflowFromContext(workflowId);
isSyncingRef.current = false;
} else if (!workflowId && selectedWorkflowId) {
// If lifecycle is cleared but context still has selection, clear context
isSyncingRef.current = true;
clearWorkflowFromContext();
isSyncingRef.current = false;
}
}, [workflowId, selectedWorkflowId, selectWorkflowFromContext, clearWorkflowFromContext]);
useEffect(() => {
const handleSetInput = (event: CustomEvent<{ value: string }>) => {
@ -104,6 +148,73 @@ export function useDashboardInputForm() {
}, []);
const { workflows, loading: workflowsLoading, refetch: refetchWorkflows } = useWorkflows();
// Track processed log IDs to avoid reprocessing
const processedLogIdsRef = useRef<Set<string>>(new Set());
const lastWorkflowIdRef = useRef<string | null>(null);
const lastDashboardLogsLengthRef = useRef<number>(0);
// Clear processed logs when workflow changes
useEffect(() => {
if (workflowId !== lastWorkflowIdRef.current) {
processedLogIdsRef.current.clear();
lastWorkflowIdRef.current = workflowId || null;
lastDashboardLogsLengthRef.current = 0;
if (!workflowId) {
clearDashboard(true);
}
}
}, [workflowId, clearDashboard]);
// Process dashboard logs when they change (only new logs)
useEffect(() => {
if (!dashboardLogs || dashboardLogs.length === 0) {
lastDashboardLogsLengthRef.current = 0;
return;
}
// Only process if the array length changed (indicating new logs)
if (dashboardLogs.length === lastDashboardLogsLengthRef.current) {
return;
}
// Filter to only new logs that haven't been processed
const newLogs = dashboardLogs.filter(log => {
const logId = log.id || `${log.operationId}-${log.timestamp}`;
if (processedLogIdsRef.current.has(logId)) {
return false;
}
processedLogIdsRef.current.add(logId);
return true;
});
// Only process if there are new logs
if (newLogs.length > 0) {
// Convert API WorkflowLog format to LogTypes WorkflowLog format
const convertedLogs: LogTypesWorkflowLog[] = newLogs.map(log => ({
id: log.id || `${log.operationId || 'unknown'}-${log.timestamp || Date.now()}`,
workflowId: log.workflowId || '',
message: log.message || '',
type: log.type,
timestamp: log.timestamp || Date.now(),
status: log.status,
progress: log.progress,
performance: log.performance,
parentId: log.parentId,
operationId: log.operationId
}));
processDashboardLogs(convertedLogs);
}
lastDashboardLogsLengthRef.current = dashboardLogs.length;
}, [dashboardLogs, processDashboardLogs]);
// Update current round in dashboard tree when it changes
useEffect(() => {
if (currentRound !== undefined) {
updateCurrentRound(currentRound);
}
}, [currentRound, updateCurrentRound]);
const workflowFiles = useMemo(() => {
const fileMap = new Map<string, WorkflowFile>();
@ -254,7 +365,7 @@ export function useDashboardInputForm() {
return allMessages.sort(sortMessages);
}, [messages, optimisticMessage, workflowId]);
const handleFileUpload = useCallback(async (file: File) => {
const handleFileUpload = useCallback(async (file: File): Promise<{ success: boolean; data: any }> => {
const result = await fileContext.handleFileUpload(file, workflowId || undefined);
if (result.success && result.fileData) {
@ -262,27 +373,32 @@ export function useDashboardInputForm() {
const fileData = responseData.file || responseData;
const fileId = fileData?.id;
if (!fileId) return;
const newFile: WorkflowFile = {
id: fileId,
fileId: fileId,
fileName: fileData.fileName || file.name,
fileSize: fileData.fileSize || file.size,
mimeType: fileData.mimeType || file.type || 'application/octet-stream',
source: 'user_uploaded'
};
setPendingFiles(prev => {
if (prev.some(f => f.fileId === fileId)) {
return prev;
}
return [...prev, newFile];
});
if (fileId) {
const newFile: WorkflowFile = {
id: fileId,
fileId: fileId,
fileName: fileData.fileName || file.name,
fileSize: fileData.fileSize || file.size,
mimeType: fileData.mimeType || file.type || 'application/octet-stream',
source: 'user_uploaded'
};
setPendingFiles(prev => {
if (prev.some(f => f.fileId === fileId)) {
return prev;
}
return [...prev, newFile];
});
}
}
return {
success: result.success || false,
data: result.fileData || null
};
}, [workflowId, fileContext]);
const handleFileAttach = useCallback(async (fileId: string) => {
const handleFileAttach = useCallback(async (fileId: string): Promise<void> => {
const isInPending = pendingFiles.some(f => f.fileId === fileId);
if (isInPending) {
@ -326,8 +442,8 @@ export function useDashboardInputForm() {
}
}, [pendingFiles, fileContext.files, workflowFiles]);
const handleFileUploadAndAttach = useCallback(async (file: File) => {
await handleFileUpload(file);
const handleFileUploadAndAttach = useCallback(async (file: File): Promise<{ success: boolean; data: any }> => {
return await handleFileUpload(file);
}, [handleFileUpload]);
const handleFileRemove = useCallback(async (file: WorkflowFile) => {
@ -425,7 +541,7 @@ export function useDashboardInputForm() {
return;
}
const selectedMode = workflowMode || 'Automation';
const selectedMode = workflowMode || 'Dynamic';
const apiWorkflowMode: 'Dynamic' | 'Automation' = selectedMode;
const workflowOptions: { workflowId?: string; workflowMode: 'Dynamic' | 'Automation' } = {
@ -451,15 +567,22 @@ export function useDashboardInputForm() {
if (wasNewWorkflow && result.data) {
const workflow = result.data as Workflow;
// Dispatch event first to trigger refetch in useWorkflows
window.dispatchEvent(new CustomEvent('workflowCreated', {
detail: { workflow }
}));
// Refetch workflows list to ensure dropdown is updated
await refetchWorkflows();
// Update context first (this will trigger the sync effect to update lifecycle)
selectWorkflowFromContext(workflow.id);
// Also directly update lifecycle to ensure immediate state update
await selectWorkflow(workflow.id);
} else if (workflowId) {
// For resumed workflows, selectWorkflow will update status from server
// For resumed workflows, ensure context is synced and update lifecycle
selectWorkflowFromContext(workflowId);
await selectWorkflow(workflowId);
}
} else {
@ -478,15 +601,20 @@ export function useDashboardInputForm() {
useEffect(() => {
const handleWorkflowCleared = () => {
// Reset all workflow-related state
setPendingFiles([]);
setOptimisticMessage(null);
// Reset workflow lifecycle state
resetWorkflow();
// Clear context selection
clearWorkflowFromContext();
};
window.addEventListener('workflowCleared', handleWorkflowCleared);
return () => {
window.removeEventListener('workflowCleared', handleWorkflowCleared);
};
}, []);
}, [resetWorkflow, clearWorkflowFromContext]);
const handleWorkflowSelect = useCallback(async (item: { id: string | number; label: string; value: any; metadata?: Record<string, any> } | null) => {
if (item === null) {
@ -543,11 +671,19 @@ export function useDashboardInputForm() {
}, []);
const workflowItems = useMemo(() => {
console.log('🔄 useDashboardInputForm: Computing workflowItems from workflows:', workflows);
if (!workflows || !Array.isArray(workflows)) {
console.warn('⚠️ useDashboardInputForm: workflows is not an array:', workflows);
return [];
}
return workflows.map(workflow => ({
if (workflows.length === 0) {
console.log(' useDashboardInputForm: workflows array is empty');
return [];
}
const items = workflows.map(workflow => ({
id: workflow.id,
label: workflow.name || workflow.id,
value: workflow,
@ -556,6 +692,9 @@ export function useDashboardInputForm() {
workflowMode: workflow.workflowMode
}
}));
console.log(`✅ useDashboardInputForm: Created ${items.length} workflow items:`, items);
return items;
}, [workflows]);
const promptItems = useMemo(() => {
@ -604,9 +743,12 @@ export function useDashboardInputForm() {
currentRound,
isRunning,
messages: displayMessages || [],
logs: logs || [],
logs: unifiedContentLogs || [], // Unified content logs (without operationId)
dashboardTree, // Dashboard log tree (logs with operationId)
onToggleOperationExpanded: toggleOperationExpanded,
getChildOperations,
workflowItems,
selectedWorkflowId: selectedWorkflowId || workflowId || null,
selectedWorkflowId: workflowId || selectedWorkflowId || null,
onWorkflowSelect: handleWorkflowSelect,
workflowsLoading,
promptItems,
@ -632,7 +774,8 @@ export function useDashboardInputForm() {
setIsFileAttachmentPopupOpen,
allUserFiles: fileContext.files || [],
handleFileAttach,
handleFileUploadAndAttach
handleFileUploadAndAttach,
latestStats
};
}
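The "process only new logs" pattern used in `useDashboardInputForm` (tracking already-seen log IDs so polled logs are never reprocessed) can be sketched as a plain function. The names and simplified types here are illustrative; in the real hook the `seen` set lives in a React ref (`processedLogIdsRef`) so it persists across renders:

```typescript
// Minimal sketch, assuming a simplified log shape. The real hook keeps
// `seen` in a ref so the set survives re-renders without causing them.
interface PolledLog {
  id?: string;
  operationId?: string;
  timestamp?: number;
}

function filterNewLogs(logs: PolledLog[], seen: Set<string>): PolledLog[] {
  return logs.filter(log => {
    // Fall back to a composite key when the backend omits `id`,
    // mirroring the hook's `log.id || `${log.operationId}-${log.timestamp}``.
    const logId = log.id ?? `${log.operationId}-${log.timestamp}`;
    if (seen.has(logId)) return false;
    seen.add(logId);
    return true;
  });
}
```

Because `seen` is mutated in place, calling the function again with an overlapping batch returns only the genuinely new entries.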


@ -0,0 +1,238 @@
import { useState, useCallback, useRef } from 'react';
import { WorkflowLog } from '../../components/UiComponents/Log/LogTypes';
interface OperationData {
logs: Map<string, WorkflowLog>;
parentId: string | null;
expanded: boolean;
latestProgress: number | null;
latestStatus: string | null;
operationName: string | null; // Stable name from first log
latestMessage: string | null; // Latest status message that updates
}
interface DashboardLogTree {
operations: Map<string, OperationData>;
rootOperations: string[];
logExpandedStates: Map<string, boolean>;
currentRound: number | null;
}
export function useDashboardLogTree() {
const [tree, setTree] = useState<DashboardLogTree>({
operations: new Map(),
rootOperations: [],
logExpandedStates: new Map(),
currentRound: null
});
const treeRef = useRef<DashboardLogTree>(tree);
treeRef.current = tree;
const generateLogId = useCallback((log: WorkflowLog): string => {
if (log.id) {
return log.id;
}
return `log_${Date.now()}_${Math.random().toString(36).substring(2, 9)}`;
}, []);
const processDashboardLogs = useCallback((logs: WorkflowLog[]) => {
setTree(prevTree => {
const newTree: DashboardLogTree = {
operations: new Map(prevTree.operations),
rootOperations: [...prevTree.rootOperations],
logExpandedStates: new Map(prevTree.logExpandedStates),
currentRound: prevTree.currentRound
};
// Process each log
logs.forEach(log => {
if (!log.operationId) {
return; // Skip logs without operationId
}
const operationId = log.operationId;
const logId = generateLogId(log);
// Get or create operation
const existingOperation = newTree.operations.get(operationId);
// Create new logs Map (copy existing logs if updating)
const logsMap = existingOperation
? new Map(existingOperation.logs)
: new Map();
// Store log (Map ensures uniqueness by logId)
logsMap.set(logId, log);
// Determine stable operation name (only set once, never change)
// Always use formatted operationId as the stable name - don't use log messages
// Log messages are status updates and should go in latestMessage, not operationName
let operationName = existingOperation?.operationName || null;
if (operationName === null) {
// Remove UUIDs and timestamps from operationId before formatting
// UUID pattern: 8-4-4-4-12 hex digits (e.g., "1e6d7b14-4f30-40e2-b7a6-748b63b6a7f5")
// Also remove standalone long hex strings that might be timestamps or IDs
let cleanedId = operationId
.replace(/[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}/gi, '') // Remove UUIDs
.replace(/\b[0-9a-f]{32,}\b/gi, '') // Remove long hex strings (timestamps/IDs)
.replace(/\s+/g, ' ') // Normalize whitespace
.trim();
// Format by splitting on dashes/underscores and capitalizing
// This creates a stable, readable name like "Workflow Planning" from "workflow-planning"
const formattedName = cleanedId
.split(/[-_\s]+/)
.filter(word => word.length > 0) // Remove empty strings
.map(word => word.charAt(0).toUpperCase() + word.slice(1).toLowerCase())
.join(' ');
operationName = formattedName || operationId.replace(/[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}/gi, '').trim();
}
// Update latest message (for status tag) - this updates with each poll
const latestMessage = log.message || existingOperation?.latestMessage || null;
// Update parentId if not set yet (from first log entry)
const parentId = existingOperation?.parentId !== null && existingOperation?.parentId !== undefined
? existingOperation.parentId
: (log.parentId !== undefined && log.parentId !== null ? log.parentId : null);
// Update latest progress (use latest value)
const latestProgress = log.progress !== undefined && log.progress !== null
? log.progress
: existingOperation?.latestProgress ?? null;
// Update latest status (use latest value)
const latestStatus = log.status !== undefined && log.status !== null
? log.status
: existingOperation?.latestStatus ?? null;
// Create new operation object to ensure React detects the change
const operation: OperationData = {
logs: logsMap,
parentId,
expanded: existingOperation?.expanded ?? false,
latestProgress,
latestStatus,
operationName,
latestMessage
};
newTree.operations.set(operationId, operation);
});
// Rebuild root operations list (operations without parentId)
// Use Set to ensure uniqueness, then convert back to array
const rootOpsSet = new Set<string>();
newTree.operations.forEach((op, opId) => {
if (op.parentId === null) {
rootOpsSet.add(opId);
}
});
// Sort by timestamp of earliest log entry (chronological order)
newTree.rootOperations = Array.from(rootOpsSet).sort((opIdA, opIdB) => {
const opA = newTree.operations.get(opIdA);
const opB = newTree.operations.get(opIdB);
if (!opA || !opB) return 0;
// Get earliest log timestamp for each operation
const logsA = Array.from(opA.logs.values());
const logsB = Array.from(opB.logs.values());
if (logsA.length === 0 && logsB.length === 0) return 0;
if (logsA.length === 0) return 1; // Put operations without logs at the end
if (logsB.length === 0) return -1;
const earliestA = Math.min(...logsA.map(log => log.timestamp || 0));
const earliestB = Math.min(...logsB.map(log => log.timestamp || 0));
return earliestA - earliestB; // Ascending order (oldest first)
});
return newTree;
});
}, [generateLogId]);
const clearDashboard = useCallback((resetRound: boolean = false) => {
setTree({
operations: new Map(),
rootOperations: [],
logExpandedStates: new Map(),
currentRound: resetRound ? null : treeRef.current.currentRound
});
}, []);
const toggleOperationExpanded = useCallback((operationId: string) => {
setTree(prevTree => {
const operation = prevTree.operations.get(operationId);
if (!operation) {
return prevTree;
}
const newTree: DashboardLogTree = {
...prevTree,
operations: new Map(prevTree.operations)
};
const updatedOperation = {
...operation,
expanded: !operation.expanded
};
newTree.operations.set(operationId, updatedOperation);
return newTree;
});
}, []);
const updateCurrentRound = useCallback((round: number | null) => {
setTree(prevTree => {
// Clear dashboard if round changes
if (prevTree.currentRound !== null && round !== null && prevTree.currentRound !== round) {
return {
operations: new Map(),
rootOperations: [],
logExpandedStates: new Map(),
currentRound: round
};
}
return {
...prevTree,
currentRound: round
};
});
}, []);
const getChildOperations = useCallback((parentId: string | null): string[] => {
const currentTree = treeRef.current;
const childOps = Array.from(currentTree.operations.entries())
.filter(([_, op]) => op.parentId === parentId)
.map(([opId, op]) => ({ opId, op }));
// Sort by timestamp of earliest log entry (chronological order)
return childOps.sort((a, b) => {
const logsA = Array.from(a.op.logs.values());
const logsB = Array.from(b.op.logs.values());
if (logsA.length === 0 && logsB.length === 0) return 0;
if (logsA.length === 0) return 1; // Put operations without logs at the end
if (logsB.length === 0) return -1;
const earliestA = Math.min(...logsA.map(log => log.timestamp || 0));
const earliestB = Math.min(...logsB.map(log => log.timestamp || 0));
return earliestA - earliestB; // Ascending order (oldest first)
}).map(({ opId }) => opId);
}, []);
return {
tree,
processDashboardLogs,
clearDashboard,
toggleOperationExpanded,
updateCurrentRound,
getChildOperations
};
}
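The stable-name derivation inside `processDashboardLogs` can be isolated as a standalone sketch: strip UUIDs and long hex runs from the `operationId`, then title-case the remaining dash/underscore-separated words. The function name is illustrative; the regexes mirror the ones in the hook above:

```typescript
// Standalone sketch of the operation-name formatting from
// processDashboardLogs: "workflow-planning-<uuid>" -> "Workflow Planning".
function formatOperationName(operationId: string): string {
  const cleanedId = operationId
    .replace(/[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}/gi, '') // remove UUIDs
    .replace(/\b[0-9a-f]{32,}\b/gi, '') // remove long hex strings (timestamps/IDs)
    .replace(/\s+/g, ' ') // normalize whitespace
    .trim();
  return cleanedId
    .split(/[-_\s]+/)
    .filter(word => word.length > 0)
    .map(word => word.charAt(0).toUpperCase() + word.slice(1).toLowerCase())
    .join(' ');
}
```

Because the name is computed once per operation and never overwritten, status text from later logs goes into `latestMessage` rather than mutating the tree node's label.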


@ -5,10 +5,18 @@ import {
type WorkflowMessage,
type WorkflowLog,
type StartWorkflowRequest,
fetchWorkflow as fetchWorkflowApi
fetchWorkflow as fetchWorkflowApi,
fetchChatData
} from '../../api/workflowApi';
import { useWorkflowOperations } from './useWorkflowOperations';
import { sortMessages, sortLogs } from './playgroundUtils';
import { useWorkflowPolling } from './useWorkflowPolling';
interface UnifiedChatDataItem {
type: 'message' | 'log' | 'stat';
item: WorkflowMessage | WorkflowLog | any;
createdAt: number;
}
export function useWorkflowLifecycle() {
const [workflowId, setWorkflowId] = useState<string | null>(null);
@ -16,12 +24,21 @@ export function useWorkflowLifecycle() {
const [currentRound, setCurrentRound] = useState<number | undefined>(undefined);
const [messages, setMessages] = useState<WorkflowMessage[]>([]);
const [logs, setLogs] = useState<WorkflowLog[]>([]);
const [dashboardLogs, setDashboardLogs] = useState<WorkflowLog[]>([]);
const [unifiedContentLogs, setUnifiedContentLogs] = useState<WorkflowLog[]>([]);
const [statusChangedFromRunningAt, setStatusChangedFromRunningAt] = useState<number | null>(null);
const [latestStats, setLatestStats] = useState<{ priceUsd?: number; processingTime?: number; bytesSent?: number; bytesReceived?: number } | null>(null);
const prevStatusRef = useRef<string>('idle');
const statusRef = useRef<string>('idle');
const statusChangedFromRunningAtRef = useRef<number | null>(null);
const lastRenderedTimestampRef = useRef<number | null>(null);
const { startWorkflow, stopWorkflow, startingWorkflow, stoppingWorkflows } = useWorkflowOperations();
const { request } = useApiRequest();
const pollingController = useWorkflowPolling();
// Store polling controller methods in refs to avoid dependency issues
const pollingControllerRef = useRef(pollingController);
pollingControllerRef.current = pollingController;
// Helper to update status and track transitions
const updateWorkflowStatus = useCallback((newStatus: string) => {
@ -45,81 +62,411 @@ export function useWorkflowLifecycle() {
const setWorkflowStatusOptimistic = useCallback((status: string) => {
updateWorkflowStatus(status);
}, [updateWorkflowStatus]);
const loadWorkflowData = useCallback(async (id: string) => {
// Convert backend log format to frontend format
const convertLogToFrontendFormat = useCallback((log: any): WorkflowLog => {
return {
id: log.id,
workflowId: log.workflowId || workflowId || '',
message: log.message || '',
type: log.type || 'info',
timestamp: log.timestamp || log.createdAt || Date.now(),
status: log.status || 'running',
progress: log.progress !== undefined && log.progress !== null ? log.progress : undefined,
performance: log.performance,
operationId: log.operationId || null,
parentId: log.parentId || null
};
}, [workflowId]);
// Process unified chat data chronologically
const processUnifiedChatData = useCallback((chatData: { messages: WorkflowMessage[]; logs: WorkflowLog[]; stats: any[] }) => {
console.log('🔄 Processing unified chat data:', {
messages: chatData.messages?.length || 0,
logs: chatData.logs?.length || 0,
stats: chatData.stats?.length || 0
});
// Build unified timeline of all items
const timeline: UnifiedChatDataItem[] = [];
// Add messages
(chatData.messages || []).forEach((message: WorkflowMessage) => {
timeline.push({
type: 'message',
item: message,
createdAt: message.publishedAt || message.timestamp || Date.now()
});
});
// Add logs
(chatData.logs || []).forEach((log: any) => {
timeline.push({
type: 'log',
item: log,
createdAt: log.timestamp || log.createdAt || Date.now()
});
});
// Add stats (if needed)
(chatData.stats || []).forEach((stat: any) => {
timeline.push({
type: 'stat',
item: stat,
createdAt: stat.timestamp || stat.createdAt || Date.now()
});
});
console.log('📋 Timeline created with', timeline.length, 'items');
// Sort chronologically
timeline.sort((a, b) => a.createdAt - b.createdAt);
// Process items sequentially to maintain chronological order
// Update lastRenderedTimestamp after processing all items (use latest timestamp)
if (timeline.length > 0) {
const latestTimestamp = timeline[timeline.length - 1].createdAt;
lastRenderedTimestampRef.current = latestTimestamp;
}
// Use functional updates to avoid dependency on current state
setMessages(prevMessages => {
const newMessages: WorkflowMessage[] = [...prevMessages];
let hasChanges = false;
let messagesAdded = 0;
let messagesUpdated = 0;
timeline.forEach((item) => {
if (item.type === 'message') {
const message = item.item as WorkflowMessage;
if (!message || !message.id) {
console.warn('⚠️ Invalid message in timeline:', message);
return;
}
// Check if message already exists
const existingIndex = newMessages.findIndex(m => m.id === message.id);
if (existingIndex >= 0) {
// Always update existing message (don't compare, just update)
newMessages[existingIndex] = message;
hasChanges = true;
messagesUpdated++;
} else {
newMessages.push(message);
hasChanges = true;
messagesAdded++;
}
}
});
console.log(`📨 Messages: ${messagesAdded} added, ${messagesUpdated} updated, total: ${newMessages.length}`);
if (messagesAdded > 0 || messagesUpdated > 0) {
console.log('📨 Sample messages:', newMessages.slice(0, 3).map(m => ({ id: m.id, message: m.message?.substring(0, 50) })));
}
// Always return sorted array if we processed any messages
if (hasChanges || timeline.some(item => item.type === 'message')) {
const sorted = [...newMessages].sort(sortMessages);
console.log(`✅ Returning ${sorted.length} sorted messages`);
return sorted;
}
console.log('⚠️ No changes detected, returning previous messages');
return prevMessages;
});
setDashboardLogs(prevDashboardLogs => {
const newDashboardLogs: WorkflowLog[] = [...prevDashboardLogs];
let hasChanges = false;
timeline.forEach((item) => {
if (item.type === 'log') {
const backendLog = item.item as any;
const frontendLog = convertLogToFrontendFormat(backendLog);
// Route logs based on operationId
if (frontendLog.operationId) {
// Logs WITH operationId → Dashboard
const existingIndex = newDashboardLogs.findIndex(l => l.id === frontendLog.id);
if (existingIndex >= 0) {
// Check if log actually changed
const existingLog = newDashboardLogs[existingIndex];
if (JSON.stringify(existingLog) !== JSON.stringify(frontendLog)) {
newDashboardLogs[existingIndex] = frontendLog;
hasChanges = true;
}
} else {
newDashboardLogs.push(frontendLog);
hasChanges = true;
}
}
}
});
// Only return new array if there are changes
if (!hasChanges) {
return prevDashboardLogs;
}
return [...newDashboardLogs].sort(sortLogs);
});
setUnifiedContentLogs(prevUnifiedContentLogs => {
const newUnifiedContentLogs: WorkflowLog[] = [...prevUnifiedContentLogs];
let hasChanges = false;
timeline.forEach((item) => {
if (item.type === 'log') {
const backendLog = item.item as any;
const frontendLog = convertLogToFrontendFormat(backendLog);
// Route logs based on operationId
if (!frontendLog.operationId) {
// Logs WITHOUT operationId → Unified Content Area
const existingIndex = newUnifiedContentLogs.findIndex(l => l.id === frontendLog.id);
if (existingIndex >= 0) {
// Check if log actually changed
const existingLog = newUnifiedContentLogs[existingIndex];
if (JSON.stringify(existingLog) !== JSON.stringify(frontendLog)) {
newUnifiedContentLogs[existingIndex] = frontendLog;
hasChanges = true;
}
} else {
newUnifiedContentLogs.push(frontendLog);
hasChanges = true;
}
}
}
});
// Only return new array if there are changes
if (!hasChanges) {
return prevUnifiedContentLogs;
}
return [...newUnifiedContentLogs].sort(sortLogs);
});
// Update combined logs for backward compatibility (using functional update)
setLogs(prevLogs => {
const allLogs: WorkflowLog[] = [...prevLogs];
timeline.forEach((item) => {
if (item.type === 'log') {
const backendLog = item.item as any;
const frontendLog = convertLogToFrontendFormat(backendLog);
const existingIndex = allLogs.findIndex(l => l.id === frontendLog.id);
if (existingIndex >= 0) {
allLogs[existingIndex] = frontendLog;
} else {
allLogs.push(frontendLog);
}
}
});
return [...allLogs].sort(sortLogs);
});
// Process stats and keep the latest one (highest createdAt)
const statsItems = timeline.filter(item => item.type === 'stat');
if (statsItems.length > 0) {
// Sort by createdAt descending to get the latest
const sortedStats = [...statsItems].sort((a, b) => b.createdAt - a.createdAt);
const latestStatItem = sortedStats[0];
const statData = latestStatItem.item || latestStatItem;
if (statData && (statData.priceUsd !== undefined || statData.processingTime !== undefined ||
statData.bytesSent !== undefined || statData.bytesReceived !== undefined)) {
setLatestStats({
priceUsd: statData.priceUsd,
processingTime: statData.processingTime,
bytesSent: statData.bytesSent,
bytesReceived: statData.bytesReceived
});
}
}
}, [convertLogToFrontendFormat]);
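The unified-timeline merge performed by `processUnifiedChatData` can be sketched in isolation: tag each item with its source type, stamp a `createdAt` from the first available timestamp field, and sort ascending so rendering stays chronological. The simplified parameter types are assumptions, not the API's actual shapes:

```typescript
// Minimal sketch of the timeline merge, assuming simplified item shapes.
interface TimelineItem {
  type: 'message' | 'log' | 'stat';
  item: unknown;
  createdAt: number;
}

function buildTimeline(
  messages: Array<{ publishedAt?: number; timestamp?: number }>,
  logs: Array<{ timestamp?: number; createdAt?: number }>,
  stats: Array<{ timestamp?: number; createdAt?: number }>
): TimelineItem[] {
  const now = Date.now(); // fallback, mirroring the hook's Date.now() default
  const timeline: TimelineItem[] = [
    ...messages.map(m => ({ type: 'message' as const, item: m, createdAt: m.publishedAt ?? m.timestamp ?? now })),
    ...logs.map(l => ({ type: 'log' as const, item: l, createdAt: l.timestamp ?? l.createdAt ?? now })),
    ...stats.map(s => ({ type: 'stat' as const, item: s, createdAt: s.timestamp ?? s.createdAt ?? now })),
  ];
  // Sort chronologically (oldest first) before routing items to state
  return timeline.sort((a, b) => a.createdAt - b.createdAt);
}
```

The hook then walks this sorted array once, routing messages, logs (split by `operationId`), and stats to their respective state setters.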
// Poll workflow data using unified chat data endpoint
const pollWorkflowData = useCallback(async (id: string) => {
try {
// Determine afterTimestamp for incremental polling
const afterTimestamp = lastRenderedTimestampRef.current || undefined;
// Fetch workflow status
const workflowData = await fetchWorkflowApi(request, id).catch(() => null);
if (workflowData) {
const status = workflowData.status || 'idle';
const round = workflowData.currentRound !== undefined ? workflowData.currentRound : undefined;
updateWorkflowStatus(status);
setCurrentRound(round);
}
// Fetch unified chat data
const chatData = await fetchChatData(request, id, afterTimestamp);
console.log('📊 Processed chat data:', {
messagesCount: chatData.messages?.length || 0,
logsCount: chatData.logs?.length || 0,
statsCount: chatData.stats?.length || 0,
afterTimestamp: afterTimestamp
});
// If we got empty results and we're using afterTimestamp, the backend might be filtering incorrectly
// Reset timestamp to null so next poll fetches all items (but only if we have existing data)
const hasNoNewData = (chatData.messages?.length || 0) === 0 &&
(chatData.logs?.length || 0) === 0 &&
(chatData.stats?.length || 0) === 0;
// Only reset if we're using afterTimestamp and got empty results
// This handles cases where backend filtering might miss items due to timestamp precision issues
if (hasNoNewData && afterTimestamp !== undefined && lastRenderedTimestampRef.current !== null) {
console.warn('⚠️ Got empty results with afterTimestamp, resetting timestamp for next poll');
// Reset now; this poll's (empty) results are already processed, so the next poll will fetch all items
lastRenderedTimestampRef.current = null;
}
// Process unified chat data
processUnifiedChatData(chatData);
// Determine if polling should continue
const currentStatus = statusRef.current;
// Stop polling immediately for failed or stopped workflows
// For completed workflows, allow grace period (handled by useEffect)
if (currentStatus === 'failed' || currentStatus === 'stopped') {
pollingControllerRef.current.stopPolling();
return;
}
// Continue polling for 'running' status
// For 'completed' status, continue if within grace period (handled by useEffect)
// Polling will be stopped by the useEffect when grace period expires or status changes to failed/stopped
} catch (error: any) {
// Handle rate limiting (429 errors)
if (error?.status === 429 || error?.response?.status === 429) {
throw error; // Let polling controller handle rate limit backoff
}
console.error('Error polling workflow data:', error);
// Don't throw for other errors - allow polling to continue with backoff
}
}, [request, updateWorkflowStatus, processUnifiedChatData]);
// Load initial workflow data (non-polling)
const _loadWorkflowData = useCallback(async (id: string) => {
try {
const workflowData = await fetchWorkflowApi(request, id).catch(() => null);
if (!workflowData) {
setMessages([]);
setLogs([]);
setDashboardLogs([]);
setUnifiedContentLogs([]);
setLatestStats(null);
return;
}
const messagesData = Array.isArray(workflowData.messages) ? workflowData.messages : [];
const logsData = Array.isArray(workflowData.logs) ? workflowData.logs : [];
const status = workflowData.status || 'idle';
const round = workflowData.currentRound !== undefined ? workflowData.currentRound : undefined;
if (messagesData.length > 0) {
setMessages([...messagesData].sort(sortMessages));
} else {
setMessages([]);
}
if (logsData.length > 0) {
setLogs([...logsData].sort(sortLogs));
} else {
setLogs([]);
}
    // Update status and track transitions
    updateWorkflowStatus(status);
    setCurrentRound(round);
// Always fetch unified chat data to get all messages and logs
// Reset lastRenderedTimestamp to fetch all historical data
lastRenderedTimestampRef.current = null;
try {
const chatData = await fetchChatData(request, id, undefined);
console.log('📥 loadWorkflowData: Fetched unified chat data:', {
messagesCount: chatData.messages?.length || 0,
logsCount: chatData.logs?.length || 0
});
processUnifiedChatData(chatData);
} catch (error) {
console.warn('⚠️ Failed to fetch unified chat data, falling back to workflowData:', error);
// Fallback to workflowData if unified chat data fails
if (messagesData.length > 0) {
setMessages([...messagesData].sort(sortMessages));
}
// Process logs and separate by operationId
const dashboardLogsList: WorkflowLog[] = [];
const unifiedContentLogsList: WorkflowLog[] = [];
logsData.forEach((log: any) => {
const frontendLog = convertLogToFrontendFormat(log);
if (frontendLog.operationId) {
dashboardLogsList.push(frontendLog);
} else {
unifiedContentLogsList.push(frontendLog);
}
});
setDashboardLogs(dashboardLogsList.sort(sortLogs));
setUnifiedContentLogs(unifiedContentLogsList.sort(sortLogs));
setLogs([...dashboardLogsList, ...unifiedContentLogsList].sort(sortLogs));
}
} catch (error) {
console.error('Error loading workflow data:', error);
}
}, [request, updateWorkflowStatus, convertLogToFrontendFormat, processUnifiedChatData]);
void _loadWorkflowData; // Intentionally unused, reserved for future use
// Set up polling when workflow is running
useEffect(() => {
if (!workflowId) {
// Only clear state if not already cleared to avoid unnecessary updates
setMessages(prev => prev.length > 0 ? [] : prev);
setLogs(prev => prev.length > 0 ? [] : prev);
setDashboardLogs(prev => prev.length > 0 ? [] : prev);
setUnifiedContentLogs(prev => prev.length > 0 ? [] : prev);
setLatestStats(null);
setCurrentRound(prev => prev !== undefined ? undefined : prev);
if (statusChangedFromRunningAt !== null) {
setStatusChangedFromRunningAt(null);
statusChangedFromRunningAtRef.current = null;
}
lastRenderedTimestampRef.current = null;
pollingControllerRef.current.stopPolling();
return;
}
// Continue polling if:
// 1. Workflow is currently running, OR
    // 2. Workflow just completed (within last 10 seconds) - grace period to catch final messages
    // Stop polling for failed or stopped workflows immediately
    // Use ref for statusChangedFromRunningAt to get latest value (state updates are async)
    const changedAtRef = statusChangedFromRunningAtRef.current;
    const shouldPoll = workflowStatus === 'running' ||
      (workflowStatus === 'completed' && changedAtRef !== null && Date.now() - changedAtRef < 10000);
if (shouldPoll) {
// Reset lastRenderedTimestamp for first poll (fetch all historical data)
if (lastRenderedTimestampRef.current === null) {
lastRenderedTimestampRef.current = null; // null means fetch all
}
// Start polling
pollingControllerRef.current.startPolling(workflowId, pollWorkflowData);
} else {
// Stop polling for failed, stopped, or completed (after grace period) workflows
pollingControllerRef.current.stopPolling();
// Clear the status change timestamp when we stop polling (only if not already null)
if (statusChangedFromRunningAt !== null) {
setStatusChangedFromRunningAt(null);
statusChangedFromRunningAtRef.current = null;
}
}
return () => {
pollingControllerRef.current.stopPolling();
};
}, [workflowStatus, workflowId, pollWorkflowData]);
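The grace-period rules in the effect above can be summarized as a small pure function. This is a hypothetical helper, not part of the codebase; the status names and the 10-second constant mirror the comments in the effect:

```typescript
type WorkflowStatus = 'idle' | 'running' | 'completed' | 'failed' | 'stopped';

const COMPLETED_GRACE_MS = 10000; // grace period after completion

function shouldContinuePolling(
  status: WorkflowStatus,
  changedFromRunningAt: number | null,
  now: number
): boolean {
  // Running workflows always poll
  if (status === 'running') return true;
  // Completed workflows keep polling briefly to catch final messages
  if (status === 'completed' && changedFromRunningAt !== null) {
    return now - changedFromRunningAt < COMPLETED_GRACE_MS;
  }
  // failed / stopped / idle: stop immediately
  return false;
}
```

Extracting the decision like this also makes the grace-period logic unit-testable without rendering the hook.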
const handleStartWorkflow = useCallback(async (
workflowData: StartWorkflowRequest,
@ -132,6 +479,8 @@ export function useWorkflowLifecycle() {
const workflow = result.data as Workflow;
setWorkflowId(workflow.id);
updateWorkflowStatus(workflow.status || 'running');
// Reset lastRenderedTimestamp for new workflow
lastRenderedTimestampRef.current = null;
return { success: true, data: result.data };
} else {
return { success: false, error: result.error || 'Failed to start workflow' };
@ -166,19 +515,27 @@ export function useWorkflowLifecycle() {
statusRef.current = 'idle';
updateWorkflowStatus('idle');
setCurrentRound(undefined);
setLatestStats(null);
setStatusChangedFromRunningAt(null);
statusChangedFromRunningAtRef.current = null;
lastRenderedTimestampRef.current = null;
pollingControllerRef.current.stopPolling();
}, [updateWorkflowStatus]);
const selectWorkflow = useCallback(async (workflowIdToSelect: string) => {
try {
setWorkflowId(workflowIdToSelect);
// Reset lastRenderedTimestamp for new workflow selection
lastRenderedTimestampRef.current = null;
const workflowData = await fetchWorkflowApi(request, workflowIdToSelect).catch(() => null);
if (!workflowData) {
setMessages([]);
setLogs([]);
setDashboardLogs([]);
setUnifiedContentLogs([]);
setLatestStats(null);
updateWorkflowStatus('idle');
return;
}
@ -191,23 +548,46 @@ export function useWorkflowLifecycle() {
updateWorkflowStatus(status);
setCurrentRound(round);
// Always fetch unified chat data to get all messages and logs (regardless of status)
// This ensures completed workflows also show their logs
try {
const chatData = await fetchChatData(request, workflowIdToSelect, undefined);
console.log('📥 selectWorkflow: Fetched unified chat data:', {
messagesCount: chatData.messages?.length || 0,
logsCount: chatData.logs?.length || 0,
status
});
processUnifiedChatData(chatData);
} catch (error) {
console.warn('⚠️ Failed to fetch unified chat data, falling back to workflowData:', error);
// Fallback to workflowData if unified chat data fails
if (messagesData.length > 0) {
setMessages([...messagesData].sort(sortMessages));
}
// Process logs and separate by operationId
const dashboardLogsList: WorkflowLog[] = [];
const unifiedContentLogsList: WorkflowLog[] = [];
logsData.forEach((log: any) => {
const frontendLog = convertLogToFrontendFormat(log);
if (frontendLog.operationId) {
dashboardLogsList.push(frontendLog);
} else {
unifiedContentLogsList.push(frontendLog);
}
});
      setDashboardLogs(dashboardLogsList.sort(sortLogs));
      setUnifiedContentLogs(unifiedContentLogsList.sort(sortLogs));
      setLogs([...dashboardLogsList, ...unifiedContentLogsList].sort(sortLogs));
    }
    // If workflow is running, polling will start automatically via useEffect
} catch (error) {
console.error('Error selecting workflow:', error);
}
  }, [request, updateWorkflowStatus, convertLogToFrontendFormat, processUnifiedChatData]);
const isRunning = workflowStatus === 'running';
const isStopping = workflowId ? stoppingWorkflows.has(workflowId) : false;
@ -221,6 +601,9 @@ export function useWorkflowLifecycle() {
startingWorkflow,
messages,
logs,
dashboardLogs,
unifiedContentLogs,
latestStats,
startWorkflow: handleStartWorkflow,
stopWorkflow: handleStopWorkflow,
resetWorkflow,
@ -228,4 +611,3 @@ export function useWorkflowLifecycle() {
setWorkflowStatusOptimistic
};
}


@ -0,0 +1,205 @@
import { useRef, useCallback } from 'react';
interface PollingState {
activeWorkflowId: string | null;
isPolling: boolean;
isPollInProgress: boolean;
isPaused: boolean;
currentInterval: number;
failureCount: number;
rateLimitFailureCount: number;
timeoutId: NodeJS.Timeout | null;
}
const BASE_INTERVAL = 5000; // 5 seconds
const MAX_INTERVAL = 10000; // 10 seconds
const BACKOFF_MULTIPLIER = 1.5;
const RATE_LIMIT_BACKOFF_MULTIPLIER = 2.0;
const MAX_RATE_LIMIT_FAILURES = 5;
export type PollCallback = (workflowId: string) => Promise<void>;
export function useWorkflowPolling() {
const stateRef = useRef<PollingState>({
activeWorkflowId: null,
isPolling: false,
isPollInProgress: false,
isPaused: false,
currentInterval: BASE_INTERVAL,
failureCount: 0,
rateLimitFailureCount: 0,
timeoutId: null
});
const pollCallbackRef = useRef<PollCallback | null>(null);
const calculateInterval = useCallback((isRateLimit: boolean = false): number => {
const state = stateRef.current;
const multiplier = isRateLimit ? RATE_LIMIT_BACKOFF_MULTIPLIER : BACKOFF_MULTIPLIER;
const newInterval = Math.min(
BASE_INTERVAL * Math.pow(multiplier, state.failureCount),
MAX_INTERVAL
);
return Math.floor(newInterval);
}, []);
const scheduleNextPoll = useCallback((interval: number) => {
const state = stateRef.current;
// Clear any existing timeout
if (state.timeoutId) {
clearTimeout(state.timeoutId);
state.timeoutId = null;
}
// Don't schedule if not polling or paused
if (!state.isPolling || state.isPaused || !state.activeWorkflowId) {
return;
}
// Schedule next poll
state.timeoutId = setTimeout(() => {
state.timeoutId = null;
doPolling();
}, interval);
}, []);
const doPolling = useCallback(async () => {
const state = stateRef.current;
// Prevent concurrent polls
if (state.isPollInProgress) {
return;
}
// Validate workflow is still active
if (!state.activeWorkflowId || !state.isPolling || state.isPaused) {
return;
}
const workflowId = state.activeWorkflowId;
state.isPollInProgress = true;
try {
if (pollCallbackRef.current) {
await pollCallbackRef.current(workflowId);
}
// Success - reset failure counts and interval
state.failureCount = 0;
state.rateLimitFailureCount = 0;
state.currentInterval = BASE_INTERVAL;
// Schedule next poll
scheduleNextPoll(state.currentInterval);
} catch (error: any) {
// Handle errors
const isRateLimit = error?.status === 429 || error?.response?.status === 429;
if (isRateLimit) {
state.rateLimitFailureCount++;
// Stop polling after too many rate limit errors
if (state.rateLimitFailureCount >= MAX_RATE_LIMIT_FAILURES) {
console.error('Too many rate limit errors, stopping polling');
stopPolling();
return;
}
} else {
state.rateLimitFailureCount = 0; // Reset rate limit count on non-rate-limit errors
}
state.failureCount++;
const nextInterval = calculateInterval(isRateLimit);
state.currentInterval = nextInterval;
console.warn(`Polling error (attempt ${state.failureCount}):`, error);
// Schedule next poll with backoff
scheduleNextPoll(nextInterval);
} finally {
state.isPollInProgress = false;
}
}, [scheduleNextPoll, calculateInterval]);
const startPolling = useCallback((workflowId: string, callback: PollCallback) => {
const state = stateRef.current;
// Stop any existing polling
if (state.isPolling) {
stopPolling();
}
// Validate workflow ID
if (!workflowId || typeof workflowId !== 'string') {
console.error('Invalid workflow ID for polling:', workflowId);
return;
}
// Set up polling state
state.activeWorkflowId = workflowId;
state.isPolling = true;
state.isPaused = false;
state.failureCount = 0;
state.rateLimitFailureCount = 0;
state.currentInterval = BASE_INTERVAL;
pollCallbackRef.current = callback;
// Execute immediate first poll (no delay)
doPolling();
}, [doPolling]);
const stopPolling = useCallback(() => {
const state = stateRef.current;
// Clear timeout
if (state.timeoutId) {
clearTimeout(state.timeoutId);
state.timeoutId = null;
}
// Reset state
state.isPolling = false;
state.isPollInProgress = false;
state.activeWorkflowId = null;
state.failureCount = 0;
state.rateLimitFailureCount = 0;
state.currentInterval = BASE_INTERVAL;
state.isPaused = false;
pollCallbackRef.current = null;
}, []);
const pausePolling = useCallback(() => {
const state = stateRef.current;
state.isPaused = true;
}, []);
const resumePolling = useCallback(() => {
const state = stateRef.current;
if (state.isPolling && state.isPaused) {
state.isPaused = false;
// Resume polling immediately
if (!state.isPollInProgress) {
scheduleNextPoll(0);
}
}
}, [scheduleNextPoll]);
const isPolling = useCallback((): boolean => {
return stateRef.current.isPolling && !stateRef.current.isPaused;
}, []);
const getActiveWorkflowId = useCallback((): string | null => {
return stateRef.current.activeWorkflowId;
}, []);
return {
startPolling,
stopPolling,
pausePolling,
resumePolling,
isPolling,
getActiveWorkflowId
};
}
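The backoff math inside `calculateInterval` can be illustrated as a standalone sketch. The constants match the hook above; `nextInterval` is a hypothetical free function, not an export of the hook:

```typescript
const BASE_INTERVAL = 5000; // 5 seconds
const MAX_INTERVAL = 10000; // 10 seconds
const BACKOFF_MULTIPLIER = 1.5;
const RATE_LIMIT_BACKOFF_MULTIPLIER = 2.0;

function nextInterval(failureCount: number, isRateLimit: boolean): number {
  // Rate-limit errors back off faster than ordinary failures
  const multiplier = isRateLimit ? RATE_LIMIT_BACKOFF_MULTIPLIER : BACKOFF_MULTIPLIER;
  // Exponential backoff, capped at MAX_INTERVAL
  return Math.floor(
    Math.min(BASE_INTERVAL * Math.pow(multiplier, failureCount), MAX_INTERVAL)
  );
}
```

With these constants the sequence for ordinary failures is 5000 ms, 7500 ms, then 10000 ms (the cap); a single rate-limit failure already reaches the 10000 ms cap.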


@ -16,9 +16,19 @@ export function useWorkflows() {
setLoading(true);
setError(null);
console.log('🔄 useWorkflows: Fetching workflows from API...');
const workflowList = await fetchWorkflowsApi(request);
if (Array.isArray(workflowList)) {
setWorkflows(workflowList);
console.log(`✅ useWorkflows: Set ${workflowList.length} workflows in state`);
} else {
console.warn('⚠️ useWorkflows: API returned non-array data:', workflowList);
setWorkflows([]);
}
} catch (error: any) {
console.error('❌ useWorkflows: Error fetching workflows:', error);
setError(error.message || 'Failed to fetch workflows');
setWorkflows([]);
} finally {
@ -39,9 +49,16 @@ export function useWorkflows() {
}
};
const handleWorkflowCreated = () => {
// Immediately refetch workflows list to include the newly created workflow
fetchWorkflows();
};
window.addEventListener('workflowDeleted', handleWorkflowDeleted as EventListener);
window.addEventListener('workflowCreated', handleWorkflowCreated as EventListener);
return () => {
window.removeEventListener('workflowDeleted', handleWorkflowDeleted as EventListener);
window.removeEventListener('workflowCreated', handleWorkflowCreated as EventListener);
};
}, [fetchWorkflows, selectedWorkflowId, clearWorkflow]);


@ -4,7 +4,7 @@ import { useMsal } from '@azure/msal-react';
import api from '../api';
import { useApiRequest } from './useApi';
import { getApiBaseUrl } from '../../config/config';
import { setUserDataCache, clearUserDataCache, type CachedUserData } from '../utils/userCache';
import {
loginApi,
fetchCurrentUserApi,
@ -44,7 +44,7 @@ export function useAuth() {
if (userData) {
// Cache user data in sessionStorage (cleared on tab close - more secure than localStorage)
        setUserDataCache(userData as CachedUserData);
}
} catch (userError) {
console.error('Failed to fetch user data after login:', userError);
@ -171,7 +171,7 @@ export function useMsalAuth() {
try {
const userData = await fetchCurrentUserApi('msft');
if (userData) {
        setUserDataCache(userData as CachedUserData);
}
} catch (userError) {
console.error('Failed to fetch user data after Microsoft login:', userError);
@ -349,7 +349,7 @@ export function useGoogleAuth() {
try {
const userData = await fetchCurrentUserApi('google');
if (userData) {
        setUserDataCache(userData as CachedUserData);
}
} catch (userError) {
console.error('Failed to fetch user data after Google login:', userError);
@ -652,7 +652,7 @@ export function useCurrentUser() {
setUser(userData);
// Cache user data in sessionStorage (cleared on tab close - more secure than localStorage)
    setUserDataCache(userData as CachedUserData);
return userData;
} catch (error: any) {


@ -7,14 +7,12 @@ import { getUserDataCache } from '../utils/userCache';
import { useApiRequest } from './useApi';
import { usePermissions, type UserPermissions } from './usePermissions';
import {
  fetchFileAttributes as _fetchFileAttributes,
fetchFiles as fetchFilesApi,
fetchFileById as fetchFileByIdApi,
updateFile as updateFileApi,
deleteFile as deleteFileApi,
  deleteFiles as deleteFilesApi
} from '../api/fileApi';
// File interfaces - exactly matching backend FileItem model
@ -32,7 +30,7 @@ export interface FileInfo {
// Field names come directly from backend attributes
export type UserFile = any;
// Attribute definition interface (local definition, not imported to avoid conflicts)
export interface AttributeDefinition {
name: string;
label: string;
@ -46,7 +44,7 @@ export interface AttributeDefinition {
filterOptions?: string[]; // For enum types
}
// Pagination parameters (local definition, not imported to avoid conflicts)
export interface PaginationParams {
page?: number;
pageSize?: number;
@ -129,8 +127,7 @@ export function useUserFiles() {
if (!cachedUser) {
// User is not authenticated, skip fetching files
setFiles([]);
      // Note: loading and error are managed by useApiRequest hook
return;
}


@ -54,6 +54,27 @@ export interface ParcelSearchResponse {
id: string;
egrid?: string;
number?: string;
perimeter?: {
closed: boolean;
punkte: Array<{
koordinatensystem: string;
x: number;
y: number;
z: number | null;
}>;
};
geometry_geojson?: {
type: string;
geometry: {
type: string;
coordinates: number[][][];
};
properties: {
id: string;
egrid?: string;
number?: string;
};
};
}>;
}
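The coordinate-extraction preference implied by this response shape (use `geometry_geojson` when present, fall back to `perimeter.punkte`) can be sketched as a small helper. `extractCoordinates` and the trimmed-down parameter type are illustrative assumptions, not part of the API:

```typescript
interface MapPoint { x: number; y: number; }

// Prefer the GeoJSON polygon's outer ring; fall back to the perimeter point list.
function extractCoordinates(adjacent: {
  geometry_geojson?: { geometry: { coordinates: number[][][] } };
  perimeter?: { punkte: Array<{ x: number; y: number }> };
}): MapPoint[] {
  const ring = adjacent.geometry_geojson?.geometry?.coordinates?.[0];
  if (Array.isArray(ring) && ring.length > 0) {
    return ring.map(([x, y]) => ({ x, y }));
  }
  if (adjacent.perimeter?.punkte) {
    return adjacent.perimeter.punkte.map((p) => ({ x: p.x, y: p.y }));
  }
  return [];
}
```

A caller would typically require at least three points before treating the result as a renderable polygon.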
@ -285,92 +306,85 @@ export function usePek() {
}
      // Adjacent parcels (if available)
      // Use geometries from the response (no need to fetch separately)
      if (data.adjacent_parcels && includeAdjacent && data.adjacent_parcels.length > 0) {
        const adjacentGeometries: ParcelGeometry[] = [];
        data.adjacent_parcels.forEach((adjacent) => {
          if (import.meta.env.DEV) {
            console.log(`🔍 Processing adjacent parcel ${adjacent.id}:`, {
              hasGeometryGeoJson: !!adjacent.geometry_geojson,
              hasPerimeter: !!adjacent.perimeter,
              geometryGeoJson: adjacent.geometry_geojson,
              perimeter: adjacent.perimeter
            });
          }
          let adjCoordinates: MapPoint[] = [];
          // Extract coordinates from geometry_geojson if available
          if (adjacent.geometry_geojson?.geometry?.coordinates) {
            const coords = adjacent.geometry_geojson.geometry.coordinates[0];
            if (Array.isArray(coords) && coords.length > 0) {
              adjCoordinates = coords.map((coord: number[]) => ({
                x: coord[0],
                y: coord[1]
              }));
              if (import.meta.env.DEV) {
                console.log(`✅ Extracted ${adjCoordinates.length} coordinates from geometry_geojson for ${adjacent.id}`);
              }
            }
          }
          // Fallback to perimeter.punkte if available
          else if (adjacent.perimeter?.punkte) {
            adjCoordinates = adjacent.perimeter.punkte.map((p) => ({
              x: p.x,
              y: p.y
            }));
            if (import.meta.env.DEV) {
              console.log(`✅ Extracted ${adjCoordinates.length} coordinates from perimeter for ${adjacent.id}`);
            }
          }
          // Only add if we have valid coordinates
          if (adjCoordinates.length >= 3) {
            adjacentGeometries.push({
              id: adjacent.id,
              egrid: adjacent.egrid,
              number: adjacent.number,
              coordinates: adjCoordinates,
              isSelected: false,
              isAdjacent: true
            });
          } else if (import.meta.env.DEV) {
            console.warn(`⚠️ Adjacent parcel ${adjacent.id} has insufficient geometry data:`, {
              coordCount: adjCoordinates.length,
              hasGeometryGeoJson: !!adjacent.geometry_geojson,
              hasPerimeter: !!adjacent.perimeter,
              geometryGeoJsonStructure: adjacent.geometry_geojson ? {
                hasGeometry: !!adjacent.geometry_geojson.geometry,
                hasCoordinates: !!adjacent.geometry_geojson.geometry?.coordinates,
                coordinatesLength: adjacent.geometry_geojson.geometry?.coordinates?.length,
                firstCoordLength: adjacent.geometry_geojson.geometry?.coordinates?.[0]?.length
              } : null
            });
          }
        });
        if (import.meta.env.DEV) {
          console.log(`📦 Adjacent parcels summary:`, {
            requested: data.adjacent_parcels.length,
            valid: adjacentGeometries.length,
            geometries: adjacentGeometries.map(g => ({
              id: g.id,
              number: g.number,
              coordCount: g.coordinates.length
            }))
          });
        }
        // Add adjacent parcels to geometries array
        geometries.push(...adjacentGeometries);
      }
// Update parcel geometries with all parcels (main + adjacent)
@ -430,20 +444,47 @@ export function usePek() {
);
/**
   * Handle parcel click on map - select the clicked parcel
*/
const handleParcelClick = useCallback(async (parcelId: string) => {
// Find the clicked parcel in the geometries
const clickedParcel = parcelGeometries.find(p => p.id === parcelId);
if (clickedParcel && clickedParcel.coordinates.length > 0) {
// Use a point inside the parcel (first coordinate is always on the boundary, which is inside)
// For better accuracy, use a point slightly inside the boundary
const firstCoord = clickedParcel.coordinates[0];
// Calculate centroid as fallback, but prefer a point we know is inside
// const sumX = clickedParcel.coordinates.reduce((sum, coord) => sum + coord.x, 0);
// const sumY = clickedParcel.coordinates.reduce((sum, coord) => sum + coord.y, 0);
// const _centroidX = sumX / clickedParcel.coordinates.length;
// const _centroidY = sumY / clickedParcel.coordinates.length;
// Use first coordinate (guaranteed to be on/in the parcel) for search
const locationString = `${firstCoord.x},${firstCoord.y}`;
await searchParcel(locationString, true); // Always include adjacent parcels
} else {
// Fallback: try to search by parcel ID/EGRID if available
if (selectedParcel?.adjacent_parcels) {
const adjacentParcel = selectedParcel.adjacent_parcels.find(p => p.id === parcelId);
if (adjacentParcel?.egrid) {
// Search by EGRID
await searchParcel(adjacentParcel.egrid, true);
} else if (adjacentParcel?.number) {
// Try searching by number (might need address context)
await searchParcel(adjacentParcel.number, true);
} else if (adjacentParcel?.id) {
// Last resort: try searching by ID
await searchParcel(adjacentParcel.id, true);
}
}
}
  }, [parcelGeometries, selectedParcel, searchParcel]);
/**
* Process natural language command
* Always includes the currently selected parcel if available
*/
const processCommand = useCallback(async (userInput: string) => {
if (!userInput.trim()) {
@ -464,9 +505,34 @@ export function usePek() {
setCommandResults((prev) => [...prev, userMessage]);
try {
      // Build request body with user input and selected parcel
      const requestBody: any = {
        userInput: userInput.trim()
      };
// Always include the currently selected parcel if available
if (selectedParcel) {
requestBody.selectedParcel = {
id: selectedParcel.parcel.id,
egrid: selectedParcel.parcel.egrid,
number: selectedParcel.parcel.number,
name: selectedParcel.parcel.name,
identnd: selectedParcel.parcel.identnd,
canton: selectedParcel.parcel.canton,
municipality_code: selectedParcel.parcel.municipality_code,
municipality_name: selectedParcel.parcel.municipality_name,
address: selectedParcel.parcel.address,
area_m2: selectedParcel.parcel.area_m2,
centroid: selectedParcel.parcel.centroid,
geoportal_url: selectedParcel.parcel.geoportal_url,
realestate_type: selectedParcel.parcel.realestate_type,
// Include geometry data if available
geometry_geojson: selectedParcel.map_view?.geometry_geojson,
perimeter: selectedParcel.parcel.perimeter
};
}
const response = await api.post('/api/realestate/command', requestBody);
const data: CommandResponse = response.data;
@ -494,6 +560,172 @@ export function usePek() {
};
setCommandResults((prev) => [...prev, assistantMessage]);
// If a project was created and there's a selected parcel, automatically add it
if (data.success && data.intent === 'CREATE' && data.entity === 'Projekt' && selectedParcel) {
try {
// Extract projekt from result
const projektResult = data.result?.result || data.result;
if (projektResult?.id) {
// Set as current projekt
setCurrentProjekt(projektResult);
// Add the selected parcel to the newly created project via direct API call
const addParcelRequestBody: any = {
parcelId: selectedParcel.parcel.id,
parcelData: {
id: selectedParcel.parcel.id,
egrid: selectedParcel.parcel.egrid,
number: selectedParcel.parcel.number,
name: selectedParcel.parcel.name,
identnd: selectedParcel.parcel.identnd,
canton: selectedParcel.parcel.canton,
municipality_code: selectedParcel.parcel.municipality_code,
municipality_name: selectedParcel.parcel.municipality_name,
address: selectedParcel.parcel.address,
area_m2: selectedParcel.parcel.area_m2,
centroid: selectedParcel.parcel.centroid,
geoportal_url: selectedParcel.parcel.geoportal_url,
realestate_type: selectedParcel.parcel.realestate_type,
geometry_geojson: selectedParcel.map_view?.geometry_geojson,
perimeter: selectedParcel.parcel.perimeter
}
};
const addResponse = await api.post(
`/api/realestate/projekt/${projektResult.id}/add-parcel`,
addParcelRequestBody
);
const addResult: AddParcelResponse = addResponse.data;
// Update current projekt with the updated version that includes the parcel
setCurrentProjekt(addResult.projekt);
// Update the assistant message to indicate parcel was added
const updateMessage = {
...assistantMessage,
id: `assistant-update-${Date.now()}`,
message: `${responseMessage}\n\n✅ Parzelle wurde automatisch zum Projekt hinzugefügt.`
};
setCommandResults((prev) => {
const updated = [...prev];
const lastIndex = updated.length - 1;
if (updated[lastIndex]?.id === assistantMessage.id) {
updated[lastIndex] = updateMessage;
}
return updated;
});
}
} catch (addError: any) {
// Log error but don't fail the command
console.error('Failed to automatically add parcel to project:', addError);
const errorMessage = addError.response?.data?.detail || addError.message || 'Unbekannter Fehler';
const errorUpdate = {
id: `assistant-error-${Date.now()}`,
role: 'assistant',
message: `⚠️ Projekt wurde erstellt, aber Parzelle konnte nicht automatisch hinzugefügt werden: ${errorMessage}`,
timestamp: Date.now()
};
setCommandResults((prev) => [...prev, errorUpdate]);
}
}
// If a parcel was created and there's a selected parcel, automatically populate it with the selected parcel data
if (data.success && data.intent === 'CREATE' && data.entity === 'Parzelle' && selectedParcel) {
try {
// Extract parzelle from result
const parzelleResult = data.result?.result || data.result;
if (parzelleResult?.id) {
// Update the newly created parcel with data from the selected parcel
const updateParcelRequestBody: any = {
// Map selected parcel data to parzelle fields
egrid: selectedParcel.parcel.egrid,
number: selectedParcel.parcel.number,
name: selectedParcel.parcel.name,
identnd: selectedParcel.parcel.identnd,
canton: selectedParcel.parcel.canton,
municipality_code: selectedParcel.parcel.municipality_code,
municipality_name: selectedParcel.parcel.municipality_name,
address: selectedParcel.parcel.address,
strasseNr: selectedParcel.parcel.address,
area_m2: selectedParcel.parcel.area_m2,
centroid: selectedParcel.parcel.centroid,
geoportal_url: selectedParcel.parcel.geoportal_url,
realestate_type: selectedParcel.parcel.realestate_type,
// Include geometry data
geometry_geojson: selectedParcel.map_view?.geometry_geojson,
perimeter: selectedParcel.parcel.perimeter
};
// Try to update the parcel via PUT request
try {
await api.put(
`/api/realestate/parzelle/${parzelleResult.id}`,
updateParcelRequestBody
);
// Update the assistant message to indicate parcel was populated
const updateMessage = {
...assistantMessage,
id: `assistant-update-${Date.now()}`,
message: `${responseMessage}\n\n✅ Parzelle wurde automatisch mit Daten der Kartenauswahl befüllt.`
};
setCommandResults((prev) => {
const updated = [...prev];
const lastIndex = updated.length - 1;
if (updated[lastIndex]?.id === assistantMessage.id) {
updated[lastIndex] = updateMessage;
}
return updated;
});
} catch (putError: any) {
// If PUT doesn't work, try PATCH
try {
await api.patch(
`/api/realestate/parzelle/${parzelleResult.id}`,
updateParcelRequestBody
);
const updateMessage = {
...assistantMessage,
id: `assistant-update-${Date.now()}`,
message: `${responseMessage}\n\n✅ Parzelle wurde automatisch mit Daten der Kartenauswahl befüllt.`
};
setCommandResults((prev) => {
const updated = [...prev];
const lastIndex = updated.length - 1;
if (updated[lastIndex]?.id === assistantMessage.id) {
updated[lastIndex] = updateMessage;
}
return updated;
});
} catch (patchError: any) {
// If both PUT and PATCH fail, log but don't fail the command
console.error('Failed to update parcel with selected parcel data:', patchError);
const errorMessage = patchError.response?.data?.detail || patchError.message || 'Unbekannter Fehler';
const errorUpdate = {
id: `assistant-error-${Date.now()}`,
role: 'assistant',
message: `⚠️ Parzelle wurde erstellt, aber konnte nicht automatisch mit Kartenauswahl-Daten befüllt werden: ${errorMessage}`,
timestamp: Date.now()
};
setCommandResults((prev) => [...prev, errorUpdate]);
}
}
}
} catch (updateError: any) {
// Log error but don't fail the command
console.error('Failed to automatically populate parcel with selected parcel data:', updateError);
const errorMessage = updateError.response?.data?.detail || updateError.message || 'Unbekannter Fehler';
const errorUpdate = {
id: `assistant-error-${Date.now()}`,
role: 'assistant',
message: `⚠️ Parzelle wurde erstellt, aber konnte nicht automatisch mit Kartenauswahl-Daten befüllt werden: ${errorMessage}`,
timestamp: Date.now()
};
setCommandResults((prev) => [...prev, errorUpdate]);
}
}
// Clear input on success
setCommandInput('');
@@ -515,7 +747,7 @@ export function usePek() {
} finally {
setIsProcessingCommand(false);
}
}, []);
}, [selectedParcel]);
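As an illustrative aside: the hunk above tries `PUT` and falls back to `PATCH` on failure. That pattern can be factored into a small helper. The `HttpClient` interface and stub below are assumptions made for this sketch, not the project's actual `api` module:

```typescript
// Hypothetical sketch: try PUT first, fall back to PATCH on failure.
// `HttpClient` is an assumed minimal interface, not the app's real api client.
interface HttpClient {
  put(url: string, body: unknown): Promise<unknown>;
  patch(url: string, body: unknown): Promise<unknown>;
}

async function updateWithFallback(
  api: HttpClient,
  url: string,
  body: unknown
): Promise<'put' | 'patch'> {
  try {
    await api.put(url, body);
    return 'put';
  } catch {
    // PUT rejected (e.g. 405 Method Not Allowed): retry once with PATCH.
    await api.patch(url, body);
    return 'patch';
  }
}

// Demo with a stub client whose PUT always fails.
const stub: HttpClient = {
  put: async () => { throw new Error('405'); },
  patch: async () => ({ ok: true })
};

updateWithFallback(stub, '/api/realestate/parzelle/123', { name: 'Test' })
  .then((method) => console.log(method)); // prints "patch"
```

Compared to the inline nested try/catch in the hunk, a helper like this keeps the success/error messaging code in one place.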
/**
* Create a new project

View file

@@ -90,10 +90,36 @@ export const usePermissions = () => {
try {
// Use retry logic for 429 errors
// Note: We wrap the API call in retry logic since useApiRequest doesn't handle 429 retries
console.log('🔐 usePermissions: Checking permissions for:', { context, item, cacheKey: key });
const permissions = await retryWithBackoff(async () => {
try {
const result = await fetchPermissionsApi(request, context, item);
console.log('✅ usePermissions: Received permissions response:', {
context,
item,
permissions: result,
view: result?.view,
viewType: typeof result?.view,
viewValue: result?.view,
read: result?.read,
create: result?.create,
update: result?.update,
delete: result?.delete,
isArray: Array.isArray(result),
keys: result ? Object.keys(result) : [],
fullResponse: JSON.stringify(result, null, 2)
});
return result;
} catch (error: any) {
console.error('❌ usePermissions: Error fetching permissions:', {
context,
item,
error: error.message,
status: error.response?.status,
statusText: error.response?.statusText,
fullError: error
});
// If useApiRequest throws, we need to check if it's a 429
// For now, we'll let the retry logic handle it
throw error;
@@ -104,6 +130,7 @@ export const usePermissions = () => {
setCache(prev => {
const newCache = { ...prev, [key]: permissions };
cacheRef.current = newCache;
console.log('💾 usePermissions: Cached permissions:', { context, item, permissions });
return newCache;
});
@@ -170,8 +197,26 @@ export const usePermissions = () => {
context: PermissionContext,
item: string
): Promise<boolean> => {
console.log('👁️ canView: Checking view access for:', { context, item });
const permissions = await checkPermission(context, item);
const hasAccess = permissions.view === true;
console.log('👁️ canView: Result:', {
context,
item,
hasAccess,
viewPermission: permissions.view,
viewPermissionType: typeof permissions.view,
viewPermissionValue: permissions.view,
allPermissions: {
view: permissions.view,
read: permissions.read,
create: permissions.create,
update: permissions.update,
delete: permissions.delete
},
fullPermissionsObject: JSON.stringify(permissions, null, 2)
});
return hasAccess;
}, [checkPermission]);
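The `retryWithBackoff` wrapper used in this file is not shown in the diff. A minimal sketch of such a helper, assuming it retries only on HTTP 429 with exponentially growing delays (the attempt count and delays here are illustrative, not the project's actual configuration):

```typescript
// Hedged sketch of a retry helper for rate-limited (429) responses.
async function retryWithBackoff<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 500
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (error: any) {
      lastError = error;
      // Only retry on rate limiting; rethrow everything else immediately.
      if (error?.response?.status !== 429) throw error;
      const delay = baseDelayMs * Math.pow(2, attempt); // 500, 1000, 2000, ...
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}
```

On success the first call returns immediately; only 429 responses trigger the growing delay, which matches how the hook above distinguishes 429 warnings from real errors.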
/**

View file

@@ -223,13 +223,8 @@ export function usePrompts() {
fieldType = 'string';
}
}
// Note: Legacy 'boolean' and 'enum' types are not in the AttributeDefinition type union
// If needed, they should be handled via type casting: (attr as any).type === 'boolean'
// Define validators and required fields
let required = attr.required === true;
@@ -444,7 +439,7 @@ export function usePromptOperations() {
}
};
const handlePromptUpdate = async (promptId: string, updateData: { name: string; content: string }, _originalData?: any) => {
setUpdateError(null);
try {

View file

@@ -57,7 +57,7 @@ export function createSettingsHook(): () => GenericDataHook {
const currentUserIdRef = useRef<string | undefined>(currentUser?.id);
// Load phone name from localStorage
const _loadPhoneName = useCallback((): string => {
try {
return localStorage.getItem('userPhoneName') || '';
} catch (error) {
@@ -65,9 +65,10 @@ export function createSettingsHook(): () => GenericDataHook {
return '';
}
}, []);
void _loadPhoneName; // Intentionally unused, reserved for future use
// Load theme from localStorage
const _loadTheme = useCallback((): string => {
try {
const savedTheme = localStorage.getItem('theme');
if (savedTheme) {
@@ -80,9 +81,10 @@ export function createSettingsHook(): () => GenericDataHook {
return 'light';
}
}, []);
void _loadTheme; // Intentionally unused, reserved for future use
// Load speech data from localStorage
const _loadSpeechData = useCallback((): any | null => {
try {
const savedData = localStorage.getItem('speechSignUpData');
const timestamp = localStorage.getItem('speechSignUpTimestamp');
@@ -109,9 +111,10 @@ export function createSettingsHook(): () => GenericDataHook {
return null;
}
}, []);
void _loadSpeechData; // Intentionally unused, reserved for future use
// Fetch user data from API
const _fetchUserData = useCallback(async () => {
if (!currentUser?.id) return null;
try {
@@ -122,9 +125,10 @@ export function createSettingsHook(): () => GenericDataHook {
throw error;
}
}, [currentUser?.id, getUser]);
void _fetchUserData; // Intentionally unused, reserved for future use
// Fetch field definitions from backend
const _fetchFieldsForSection = useCallback(async (sectionId: string): Promise<SettingsFieldConfig[]> => {
try {
setSettingsLoading(prev => ({ ...prev, [sectionId]: true }));
setSettingsErrors(prev => ({ ...prev, [sectionId]: null }));
@@ -148,6 +152,7 @@ export function createSettingsHook(): () => GenericDataHook {
setSettingsLoading(prev => ({ ...prev, [sectionId]: false }));
}
}, [request]);
void _fetchFieldsForSection; // Intentionally unused, reserved for future use
// Load all settings data
const loadSettingsData = useCallback(async () => {

View file

@@ -29,13 +29,29 @@ export function useCurrentUser() {
try {
// Check if we already have user data in sessionStorage cache
const cachedUser = getUserDataCache();
if (cachedUser && cachedUser.username) {
// Check if cached user has roleLabels - if empty, refetch from API
const hasRoleLabels = Array.isArray(cachedUser.roleLabels) && cachedUser.roleLabels.length > 0;
const hasPrivilege = !!cachedUser.privilege;
if (!hasRoleLabels && !hasPrivilege) {
console.warn('⚠️ Cached user data has no roleLabels or privilege, refetching from API:', {
username: cachedUser.username,
roleLabels: cachedUser.roleLabels,
privilege: cachedUser.privilege
});
// Clear cache and continue to fetch from API
clearUserDataCache();
} else {
// Use cached user data - permissions are checked via RBAC API, not client-side
setUser(cachedUser);
console.log('✅ Using cached user data from sessionStorage (persists during session):', {
username: cachedUser.username,
roleLabels: cachedUser.roleLabels,
privilege: cachedUser.privilege
});
return;
}
}
// JWT tokens are now stored in httpOnly cookies, so we fetch user data from API
@@ -64,13 +80,58 @@ export function useCurrentUser() {
}
const data = await fetchCurrentUserApi(request, authAuthority || undefined);
// Log full response for debugging
console.log('📦 User data received from API:', {
username: data?.username,
roleLabels: data?.roleLabels,
privilege: data?.privilege,
hasRoleLabels: !!data?.roleLabels,
roleLabelsLength: Array.isArray(data?.roleLabels) ? data.roleLabels.length : 0,
roleLabelsContent: Array.isArray(data?.roleLabels) ? data.roleLabels : 'not an array',
hasPrivilege: !!data?.privilege,
allKeys: data ? Object.keys(data) : [],
fullData: JSON.stringify(data, null, 2)
});
// Always cache user data - permissions are checked via RBAC API, not client-side
// roleLabels/privilege are optional metadata for display/logging purposes
if (!data || !data.username) {
console.error('❌ User data from API is invalid:', {
username: data?.username,
dataKeys: data ? Object.keys(data) : [],
fullResponse: data
});
throw new Error('Invalid user data received from API');
}
// Check if API returned roleLabels - if not, log warning but still cache
const hasRoleLabels = Array.isArray(data.roleLabels) && data.roleLabels.length > 0;
const hasPrivilege = !!data.privilege;
if (!hasRoleLabels && !hasPrivilege) {
console.warn('⚠️ User data from API has no roleLabels or privilege - this may cause RBAC issues:', {
username: data.username,
roleLabels: data.roleLabels,
privilege: data.privilege,
allKeys: Object.keys(data),
fullResponse: JSON.stringify(data, null, 2)
});
// Still cache it, but log the issue - backend RBAC should handle permissions
// However, if backend expects roleLabels, this will cause problems
}
// Cache user data (permissions are checked via RBAC API)
setUserDataCache(data);
console.log('✅ User data fetched from API and cached in sessionStorage (secure):', {
username: data.username,
roleLabels: data.roleLabels,
roleLabelsLength: Array.isArray(data.roleLabels) ? data.roleLabels.length : 0,
privilege: data.privilege,
hasRoleLabels,
hasPrivilege
});
setUser(data);
} catch (error: any) {
console.error('❌ Failed to fetch user data:', error);
@@ -125,14 +186,8 @@ export function useCurrentUser() {
}
try {
// Note: logoutEndpoint is determined by logoutUserApi based on authenticationAuthority
await logoutUserApi(request, user.authenticationAuthority);
@@ -244,9 +299,30 @@ export function useCurrentUser() {
useEffect(() => {
// Try to load user from sessionStorage cache first for faster initial load
const cachedUser = getUserDataCache();
if (cachedUser && cachedUser.username) {
// Check if cached user has roleLabels - if empty, refetch from API
const hasRoleLabels = Array.isArray(cachedUser.roleLabels) && cachedUser.roleLabels.length > 0;
const hasPrivilege = !!cachedUser.privilege;
if (!hasRoleLabels && !hasPrivilege) {
console.warn('⚠️ Cached user data has no roleLabels or privilege, refetching from API:', {
username: cachedUser.username,
roleLabels: cachedUser.roleLabels,
privilege: cachedUser.privilege
});
// Clear cache and refetch
clearUserDataCache();
fetchCurrentUser();
return;
}
// Use cached user data - permissions are checked via RBAC API
setUser(cachedUser);
console.log('✅ Using cached user data from sessionStorage on mount (persists during session):', {
username: cachedUser.username,
roleLabels: cachedUser.roleLabels,
privilege: cachedUser.privilege
});
}
// For OAuth authentication, wait a bit longer before fetching user data
@@ -320,7 +396,13 @@ export function useOrgUsers() {
setAttributes(attrs);
return attrs;
} catch (error: any) {
// Don't log 429 errors as errors (they're rate limit warnings)
if (error.response?.status === 429) {
console.warn('Rate limit exceeded while fetching user attributes. Please wait.');
} else if (error.response?.status !== 401) {
// Only log non-auth errors (401 is expected when not logged in)
console.error('Error fetching attributes:', error);
}
setAttributes([]);
return [];
}
@@ -498,13 +580,8 @@ export function useOrgUsers() {
fieldType = 'string';
}
}
// Note: Legacy 'boolean' and 'enum' types are not in the AttributeDefinition type union
// If needed, they should be handled via type casting: (attr as any).type === 'boolean'
// Define validators and required fields
let required = attr.required === true;
@@ -547,7 +624,7 @@ export function useOrgUsers() {
key: attr.name,
label: attr.label || attr.name,
type: fieldType,
editable: (attr as any).editable !== false && (attr as any).readonly !== true,
required,
validator,
minRows,
@@ -562,6 +639,12 @@ export function useOrgUsers() {
// Ensure attributes are loaded - can be called by EditActionButton
const ensureAttributesLoaded = useCallback(async () => {
// Don't fetch attributes if user is not authenticated (prevents 401 errors)
const currentUser = getUserDataCache();
if (!currentUser) {
return [];
}
if (attributes && attributes.length > 0) {
return attributes;
}
@@ -570,10 +653,13 @@ export function useOrgUsers() {
return fetchedAttributes;
}, [attributes, fetchAttributes]);
// Fetch attributes and permissions on mount (only if user is authenticated)
useEffect(() => {
const currentUser = getUserDataCache();
if (currentUser) {
fetchAttributes();
fetchPermissions();
}
}, [fetchAttributes, fetchPermissions]);
// Initial fetch
@@ -652,7 +738,7 @@ export function useUserOperations() {
}
};
const handleUserUpdate = async (userId: string, updateData: UserUpdateData, _originalData?: any) => {
setUpdateError(null);
setEditingUsers(prev => new Set(prev).add(userId));

View file

@@ -19,7 +19,7 @@ import { MessageOverlay } from '../components/UiComponents';
import type { MessageMode } from '../components/UiComponents';
import { useLanguage } from '../providers/language/LanguageContext';
import { useWorkflowSelection } from '../contexts/WorkflowSelectionContext';
// import { getUserDataCache } from '../utils/userCache'; // Unused import
import { usePermissions, type UserPermissions } from './usePermissions';
// Workflow interface matching backend
@@ -279,13 +279,8 @@ export function useUserWorkflows() {
fieldType = 'string';
}
}
// Note: Legacy 'boolean' and 'enum' types are not in the AttributeDefinition type union
// If needed, they should be handled via type casting: (attr as any).type === 'boolean'
// Define validators and required fields
let required = attr.required === true;
@@ -360,7 +355,7 @@ export function useUserWorkflows() {
// Listen for workflow creation events to refetch workflows list
useEffect(() => {
const handleWorkflowCreated = (_event: CustomEvent<{ workflow: UserWorkflow }>) => {
// Refetch to ensure we have the latest data
fetchWorkflowsData();
};
@@ -409,7 +404,7 @@ export function useWorkflowOperations() {
const [warningData, setWarningData] = useState<{ header: string; message: string; mode: MessageMode } | null>(null);
// Language context
const { t: _t } = useLanguage();
// Workflow selection context - to clear selection if deleted workflow is selected
const { selectedWorkflowId, clearWorkflow } = useWorkflowSelection();
@@ -594,7 +589,7 @@ export function useWorkflowOperations() {
);
};
const handleWorkflowUpdate = async (workflowId: string, updateData: Partial<{ name: string; description?: string; tags?: string[] }>, _originalWorkflowData?: any) => {
setUpdateError(null);
setEditingWorkflows(prev => new Set(prev).add(workflowId));

View file

@@ -261,7 +261,7 @@
grid-column: 1 / -1;
grid-row: 1;
display: grid;
grid-template-columns: 2fr 1fr;
gap: 1rem;
min-height: 0;
overflow: hidden;

View file

@@ -0,0 +1,181 @@
/**
* Utility functions for mapping attribute types to HTML input types and component types
*/
export type AttributeType =
| 'text'
| 'textarea'
| 'select'
| 'multiselect'
| 'integer'
| 'float'
| 'number'
| 'timestamp'
| 'date'
| 'time'
| 'checkbox'
| 'boolean'
| 'email'
| 'url'
| 'password'
| 'file'
| 'string'
| 'enum'
| 'readonly';
export type InputComponentType =
| 'text'
| 'textarea'
| 'select'
| 'multiselect'
| 'checkbox'
| 'file'
| 'email'
| 'url'
| 'password'
| 'date'
| 'time'
| 'datetime-local'
| 'number';
/**
* Maps attribute type to HTML input type
*
* @param attributeType - The attribute type from the backend
* @returns The corresponding HTML input type
*
* Mapping rules:
* - text → text (single line)
* - textarea → textarea (multi-line)
* - select → select (dropdown with options)
* - multiselect → multiselect (multiple selection)
* - integer → number (integer only)
* - float or number → number (decimal allowed)
* - timestamp → datetime-local (date/time picker)
* - date → date (date picker, date only)
* - time → time (time picker, time only)
* - checkbox or boolean → checkbox (boolean)
* - email → email (with email validation)
* - url → url (with URL validation)
* - password → password (masked)
* - file → file (file upload)
*/
export function attributeTypeToInputType(attributeType: AttributeType): InputComponentType {
switch (attributeType) {
case 'text':
case 'string':
return 'text';
case 'textarea':
return 'textarea';
case 'select':
case 'enum':
return 'select';
case 'multiselect':
return 'multiselect';
case 'integer':
case 'number':
case 'float':
return 'number';
case 'timestamp':
return 'datetime-local';
case 'date':
return 'date';
case 'time':
return 'time';
case 'checkbox':
case 'boolean':
return 'checkbox';
case 'email':
return 'email';
case 'url':
return 'url';
case 'password':
return 'password';
case 'file':
return 'file';
case 'readonly':
return 'text'; // Default to text for readonly, but should be rendered as readonly
default:
// Default fallback to text input
return 'text';
}
}
/**
* Determines if an attribute type should render as a textarea
*/
export function isTextareaType(attributeType: AttributeType): boolean {
return attributeType === 'textarea';
}
/**
* Determines if an attribute type should render as a select dropdown
*/
export function isSelectType(attributeType: AttributeType): boolean {
return attributeType === 'select' || attributeType === 'enum';
}
/**
* Determines if an attribute type should render as a multiselect
*/
export function isMultiselectType(attributeType: AttributeType): boolean {
return attributeType === 'multiselect';
}
/**
* Determines if an attribute type should render as a checkbox
*/
export function isCheckboxType(attributeType: AttributeType): boolean {
return attributeType === 'checkbox' || attributeType === 'boolean';
}
/**
* Determines if an attribute type should render as a file input
*/
export function isFileType(attributeType: AttributeType): boolean {
return attributeType === 'file';
}
/**
* Determines if an attribute type should render as a number input
*/
export function isNumberType(attributeType: AttributeType): boolean {
return attributeType === 'integer' || attributeType === 'number' || attributeType === 'float';
}
/**
* Determines if an attribute type should render as a date/time input
*/
export function isDateTimeType(attributeType: AttributeType): boolean {
return attributeType === 'timestamp' || attributeType === 'date' || attributeType === 'time';
}
/**
* Gets the default value for an attribute type
*/
export function getDefaultValueForType(attributeType: AttributeType): any {
if (isCheckboxType(attributeType)) {
return false;
}
if (isMultiselectType(attributeType)) {
return [];
}
if (isNumberType(attributeType)) {
return 0;
}
return '';
}
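For illustration, the mapping and default-value helpers above behave like this. The functions are trimmed and reproduced inline so the snippet is self-contained; names are shortened for the example:

```typescript
// Trimmed reproduction of attributeTypeToInputType / getDefaultValueForType,
// for demonstration only - the real functions cover more attribute types.
type Attr = 'text' | 'float' | 'timestamp' | 'boolean' | 'multiselect' | 'enum';

function toInputType(t: Attr): string {
  switch (t) {
    case 'float': return 'number';
    case 'timestamp': return 'datetime-local';
    case 'boolean': return 'checkbox';
    case 'multiselect': return 'multiselect';
    case 'enum': return 'select';
    default: return 'text';
  }
}

function defaultValueFor(t: Attr): unknown {
  if (t === 'boolean') return false;   // checkboxes start unchecked
  if (t === 'multiselect') return [];  // multiselects start empty
  if (t === 'float') return 0;         // numeric inputs start at 0
  return '';                           // everything else: empty string
}

console.log(toInputType('timestamp')); // datetime-local
console.log(toInputType('enum'));      // select
```

A form renderer can use the first function to pick the HTML input type and the second to seed initial form state before any user input.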

View file

@@ -1,11 +1,14 @@
import { PrivilegeChecker } from '../core/PageManager/pageInterface';
import { getUserDataCache } from './userCache';
import type { PermissionContext } from '../hooks/usePermissions';
/**
* Privilege Checkers
*
* Read-only access to user data for privilege checking.
* Does not manage user data storage - that's handled by authentication hooks.
*
* Now supports both client-side checks (roles, localStorage) and backend RBAC integration.
*/
// Function to get current user privilege from sessionStorage cache
@@ -96,6 +99,123 @@ export const createCustomPrivilegeChecker = (
return checkFunction;
};
/**
* Create a privilege checker that uses backend RBAC permissions
* This integrates privilegeCheckers with usePermissions for backend-controlled access
*
* @param canViewFunction - The canView function from usePermissions hook
* @param context - Permission context ('UI', 'DATA', or 'RESOURCE')
* @param item - The item/resource path to check permissions for
* @returns A PrivilegeChecker function that checks backend RBAC permissions
*/
export const createRBACPrivilegeChecker = (
canViewFunction: (context: PermissionContext, item: string) => Promise<boolean>,
context: PermissionContext,
item: string
): PrivilegeChecker => {
return async (): Promise<boolean> => {
try {
return await canViewFunction(context, item);
} catch (error) {
console.error(`Error checking RBAC privilege for ${context}:${item}:`, error);
return false;
}
};
};
/**
* Create a privilege checker that combines RBAC with client-side role checks
* First checks backend RBAC, then falls back to client-side role check if RBAC allows
*
* @param canViewFunction - The canView function from usePermissions hook
* @param context - Permission context ('UI', 'DATA', or 'RESOURCE')
* @param item - The item/resource path to check permissions for
* @param requiredRoles - Fallback client-side roles to check if RBAC passes
* @returns A PrivilegeChecker function that checks both RBAC and roles
*/
export const createCombinedPrivilegeChecker = (
canViewFunction: (context: PermissionContext, item: string) => Promise<boolean>,
context: PermissionContext,
item: string,
requiredRoles: string[]
): PrivilegeChecker => {
return async (): Promise<boolean> => {
try {
// First check backend RBAC
const hasRBACAccess = await canViewFunction(context, item);
if (!hasRBACAccess) {
return false;
}
// If RBAC allows, also check client-side roles as additional validation
const userPrivilege = getCurrentUserPrivilege();
if (userPrivilege && requiredRoles.includes(userPrivilege)) {
return true;
}
// If no role match, still allow if RBAC said yes (backend is source of truth)
return hasRBACAccess;
} catch (error) {
console.error(`Error checking combined privilege for ${context}:${item}:`, error);
return false;
}
};
};
/**
* Helper to create RBAC-based privilege checkers for page data
* These checkers will use backend RBAC permissions via usePermissions
*
* Usage in page data:
* import { createRBACPageChecker } from '@/utils/privilegeCheckers';
*
* // In PageManager, initialize with canView function:
* const rbacCheckers = createRBACPageCheckers(canView);
*
* // In page data:
* privilegeChecker: rbacCheckers.forPage('administration/workflows')
*/
export const createRBACPageCheckers = (
canViewFunction: (context: PermissionContext, item: string) => Promise<boolean>
) => {
return {
/**
* Create a privilege checker for a specific page path
* Checks backend RBAC permissions for UI context
*/
forPage: (pagePath: string): PrivilegeChecker => {
return createRBACPrivilegeChecker(canViewFunction, 'UI', pagePath);
},
/**
* Create a privilege checker that combines RBAC with role requirements
* First checks backend RBAC, then validates user role
*/
forPageWithRole: (
pagePath: string,
requiredRoles: string[]
): PrivilegeChecker => {
return createCombinedPrivilegeChecker(canViewFunction, 'UI', pagePath, requiredRoles);
},
/**
* Create a privilege checker for a data resource
* Checks backend RBAC permissions for DATA context
*/
forData: (resourcePath: string): PrivilegeChecker => {
return createRBACPrivilegeChecker(canViewFunction, 'DATA', resourcePath);
},
/**
* Create a privilege checker for a UI resource
* Checks backend RBAC permissions for UI context
*/
forUI: (resourcePath: string): PrivilegeChecker => {
return createRBACPrivilegeChecker(canViewFunction, 'UI', resourcePath);
}
};
};
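A possible wiring of these factories, with a stubbed `canView` standing in for the `usePermissions` hook. The permitted page path is made up for the example and only the `forPage` branch is reproduced:

```typescript
type PermissionContext = 'UI' | 'DATA' | 'RESOURCE';
type PrivilegeChecker = () => Promise<boolean>;

// Stub canView: grants access only to one hard-coded UI path.
const canView = async (ctx: PermissionContext, item: string): Promise<boolean> =>
  ctx === 'UI' && item === 'administration/workflows';

// Minimal reproduction of the forPage factory shown above.
const forPage = (pagePath: string): PrivilegeChecker =>
  async () => {
    try {
      return await canView('UI', pagePath);
    } catch {
      return false; // deny on error, mirroring createRBACPrivilegeChecker
    }
  };

forPage('administration/workflows')().then((ok) => console.log(ok)); // true
forPage('administration/users')().then((ok) => console.log(ok));     // false
```

The key design point is that the checker is a closure over the backend `canView` call, so page definitions stay declarative while the access decision remains server-controlled.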
// Predefined privilege checkers for common use cases
export const privilegeCheckers = {
// Speech signup checker (existing functionality)

View file

@@ -17,7 +17,8 @@ export interface CachedUserData {
username: string;
email: string;
fullName: string;
privilege?: string; // Deprecated - use roleLabels instead
roleLabels?: string[]; // Array of role labels from backend (e.g., ["user"])
mandateId: string;
language: string;
enabled: boolean;
@@ -30,6 +31,8 @@ export interface CachedUserData {
*/
export const setUserDataCache = (userData: CachedUserData): void => {
if (userData) {
// Always cache user data - permissions are checked via RBAC API, not client-side
// roleLabels/privilege are optional metadata, not required for app functionality
try {
sessionStorage.setItem(USER_CACHE_KEY, JSON.stringify(userData));
} catch (error) {