Mirror of https://github.com/Frooodle/Stirling-PDF.git (synced 2025-09-03 17:52:30 +02:00)
feat: Implement shared hooks for tool operations
- Introduced `useToolApiCalls` for handling API calls with file processing and cancellation support.
- Created `useToolOperation` to manage tool operations, including state management, error handling, and file processing.
- Added `useToolResources` for managing blob URLs and generating thumbnails.
- Developed `useToolState` for centralized state management of tool operations.
- Refactored `useSplitOperation` to utilize the new shared hooks, simplifying the execution of split operations.
- Updated `useSplitParameters` to remove mode state and integrate with the new parameter structure.
- Enhanced error handling with `toolErrorHandler` utilities for standardized error extraction and messaging.
- Implemented `toolOperationTracker` for creating operation tracking data for file context integration.
- Added `toolResponseProcessor` for processing API response blobs based on handler configuration.
This commit is contained in:
parent 24a9104ebf
commit dcadada7d3

Changed files: CLAUDE.md (60 changed lines), plus the tool hook files shown in the diff below.
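
Before the diff, a rough sketch of the configuration object the new orchestrator hook appears to accept, assembled only from the fields the refactored hooks pass to `useToolOperation` later in this commit (`operationType`, `endpoint`, `buildFormData`, `filePrefix`, `singleFileMode`, `timeout`, `responseHandler`, `validateParams`, `getErrorMessage`). The field types and the `ValidationResult` shape are assumptions; the real `ToolOperationConfig` in `frontend/src/hooks/tools/shared/useToolOperation.ts` may differ.

```typescript
// Hypothetical sketch of ToolOperationConfig, inferred from usage in this commit.
interface ValidationResult {
  valid: boolean;
  errors?: string[]; // assumed: the compress hook returns { valid: false, errors: [...] }
}

interface ToolOperationConfig<TParams> {
  operationType: string;                                // e.g. 'compress', 'convert', 'ocr'
  endpoint: string | ((params: TParams) => string);     // static URL, or derived per call (convert)
  // compress/OCR build per-file form data; convert passes the whole selection at once
  buildFormData: (params: TParams, fileOrFiles: File | File[]) => FormData;
  filePrefix?: string;                                  // e.g. 'compressed_', 'converted_'
  singleFileMode?: boolean;
  timeout?: number;                                     // request timeout in ms
  responseHandler?: { type: 'single' | 'zip' | 'custom' };
  validateParams?: (params: TParams) => ValidationResult;
  getErrorMessage?: (error: any) => string;
}
```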
CLAUDE.md

@@ -59,12 +59,53 @@ Frontend designed for **stateful document processing**:
 Without cleanup: browser crashes with memory leaks.
 
 #### Tool Development
-- **Pattern**: Follow `src/tools/Split.tsx` as reference implementation
-- **File Access**: Tools receive `selectedFiles` prop (computed from activeFiles based on user selection)
-- **File Selection**: Users select files in FileEditor (tool mode) → stored as IDs → computed to File objects for tools
-- **Integration**: All files are part of FileContext ecosystem - automatic memory management and operation tracking
-- **Parameters**: Tool parameter handling patterns still being standardized
-- **Preview Integration**: Tools can implement preview functionality (see Split tool's thumbnail preview)
+**Architecture**: Modular hook-based system with clear separation of concerns:
+- **useToolOperation** (`frontend/src/hooks/tools/shared/useToolOperation.ts`): Main orchestrator hook
+  - Coordinates all tool operations with consistent interface
+  - Integrates with FileContext for operation tracking
+  - Handles validation, error handling, and UI state management
+
+- **Supporting Hooks**:
+  - **useToolState**: UI state management (loading, progress, error, files)
+  - **useToolApiCalls**: HTTP requests and file processing
+  - **useToolResources**: Blob URLs, thumbnails, ZIP downloads
+
+- **Utilities**:
+  - **toolErrorHandler**: Standardized error extraction and i18n support
+  - **toolResponseProcessor**: API response handling (single/zip/custom)
+  - **toolOperationTracker**: FileContext integration utilities
+
+**Tool Implementation Pattern**:
+1. Create hook in `frontend/src/hooks/tools/[toolname]/use[ToolName]Operation.ts`
+2. Define parameters interface and validation
+3. Implement `buildFormData` function for API requests
+4. Configure `useToolOperation` with endpoints and settings
+5. UI components consume the hook's state and actions
+
+**Example Pattern** (see `useCompressOperation.ts`):
+```typescript
+export const useCompressOperation = () => {
+  const { t } = useTranslation();
+
+  return useToolOperation<CompressParameters>({
+    operationType: 'compress',
+    endpoint: '/api/v1/misc/compress-pdf',
+    buildFormData,
+    filePrefix: 'compressed_',
+    validateParams: (params) => { /* validation logic */ },
+    getErrorMessage: createStandardErrorHandler(t('compress.error.failed'))
+  });
+};
+```
+
+**Benefits**:
+- **Consistent**: All tools follow same pattern and interface
+- **Maintainable**: Single responsibility hooks, easy to test and modify
+- **i18n Ready**: Built-in internationalization support
+- **Type Safe**: Full TypeScript support with generic interfaces
+- **Memory Safe**: Automatic resource cleanup and blob URL management
 
 ## Architecture Overview
 
@@ -126,7 +167,10 @@ Without cleanup: browser crashes with memory leaks.
 - **Core Status**: React SPA architecture complete with multi-tool workflow support
 - **State Management**: FileContext handles all file operations and tool navigation
 - **File Processing**: Production-ready with memory management for large PDF workflows (up to 100GB+)
-- **Tool Integration**: Standardized tool interface - see `src/tools/Split.tsx` as reference
+- **Tool Integration**: Modular hook architecture with `useToolOperation` orchestrator
+  - Individual hooks: `useToolState`, `useToolApiCalls`, `useToolResources`
+  - Utilities: `toolErrorHandler`, `toolResponseProcessor`, `toolOperationTracker`
+  - Pattern: Each tool creates focused operation hook, UI consumes state/actions
 - **Preview System**: Tool results can be previewed without polluting file context (Split tool example)
 - **Performance**: Web Worker thumbnails, IndexedDB persistence, background processing
 
@@ -141,7 +185,7 @@ Without cleanup: browser crashes with memory leaks.
 - **Security**: When `DOCKER_ENABLE_SECURITY=false`, security-related classes are excluded from compilation
 - **FileContext**: All file operations MUST go through FileContext - never bypass with direct File handling
 - **Memory Management**: Manual cleanup required for PDF.js documents and blob URLs - don't remove cleanup code
-- **Tool Development**: New tools should follow Split tool pattern (`src/tools/Split.tsx`)
+- **Tool Development**: New tools should follow `useToolOperation` hook pattern (see `useCompressOperation.ts`)
 - **Performance Target**: Must handle PDFs up to 100GB+ without browser crashes
 - **Preview System**: Tools can preview results without polluting main file context (see Split tool implementation)
 
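
Step 5 of the pattern documented above says UI components consume the hook's state and actions. A minimal sketch of what such a consumer might look like, assuming the orchestrator still exposes roughly the flattened fields the pre-refactor hooks returned (`files`, `status`, `errorMessage`, `isLoading`, `executeOperation`, `resetResults`); the import path and the exact return shape of `useToolOperation` are assumptions.

```typescript
// Hypothetical consumer; field names are taken from the hook interfaces removed
// later in this diff, not from useToolOperation itself.
import React from 'react';
import { useCompressOperation, CompressParameters } from '../hooks/tools/compress/useCompressOperation'; // illustrative path

const CompressPanel: React.FC<{ selectedFiles: File[] }> = ({ selectedFiles }) => {
  const { executeOperation, files, status, errorMessage, isLoading, resetResults } = useCompressOperation();

  // Partial parameter object for illustration; the full CompressParameters shape
  // is defined in the hook file shown below.
  const params = {
    compressionLevel: 5,
    compressionMethod: 'quality',
    grayscale: false,
  } as unknown as CompressParameters;

  return (
    <div>
      <button disabled={isLoading} onClick={() => executeOperation(params, selectedFiles)}>
        Compress
      </button>
      {status && <p>{status}</p>}
      {errorMessage && <p role="alert">{errorMessage}</p>}
      {files.length > 0 && <button onClick={resetResults}>Clear results</button>}
    </div>
  );
};

export default CompressPanel;
```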
SplitSettings.tsx

@@ -3,6 +3,7 @@ import { useTranslation } from 'react-i18next';
 import { SPLIT_MODES, SPLIT_TYPES, type SplitMode, type SplitType } from '../../../constants/splitConstants';
 
 export interface SplitParameters {
+  mode: SplitMode | '';
   pages: string;
   hDiv: string;
   vDiv: string;
@@ -15,16 +16,12 @@ export interface SplitParameters {
 }
 
 export interface SplitSettingsProps {
-  mode: SplitMode | '';
-  onModeChange: (mode: SplitMode | '') => void;
   parameters: SplitParameters;
   onParameterChange: (parameter: keyof SplitParameters, value: string | boolean) => void;
   disabled?: boolean;
 }
 
 const SplitSettings = ({
-  mode,
-  onModeChange,
   parameters,
   onParameterChange,
   disabled = false
@@ -125,8 +122,8 @@ const SplitSettings = ({
       <Select
         label="Choose split method"
         placeholder="Select how to split the PDF"
-        value={mode}
-        onChange={(v) => v && onModeChange(v)}
+        value={parameters.mode}
+        onChange={(v) => v && onParameterChange('mode', v)}
         disabled={disabled}
         data={[
           { value: SPLIT_MODES.BY_PAGES, label: t("split.header", "Split by Pages") + " (e.g. 1,3,5-10)" },
@@ -137,10 +134,10 @@ const SplitSettings = ({
       />
 
       {/* Parameter Form */}
-      {mode === SPLIT_MODES.BY_PAGES && renderByPagesForm()}
-      {mode === SPLIT_MODES.BY_SECTIONS && renderBySectionsForm()}
-      {mode === SPLIT_MODES.BY_SIZE_OR_COUNT && renderBySizeOrCountForm()}
-      {mode === SPLIT_MODES.BY_CHAPTERS && renderByChaptersForm()}
+      {parameters.mode === SPLIT_MODES.BY_PAGES && renderByPagesForm()}
+      {parameters.mode === SPLIT_MODES.BY_SECTIONS && renderBySectionsForm()}
+      {parameters.mode === SPLIT_MODES.BY_SIZE_OR_COUNT && renderBySizeOrCountForm()}
+      {parameters.mode === SPLIT_MODES.BY_CHAPTERS && renderByChaptersForm()}
     </Stack>
   );
 }
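
With `mode` folded into `SplitParameters`, the separate `mode`/`onModeChange` props disappear and everything flows through the single `onParameterChange` callback. A minimal sketch of the parent-side wiring under that assumption; the real `useSplitParameters` hook updated in this commit may manage this state differently.

```typescript
// Hypothetical parent wiring; the import path is illustrative.
import { useState } from 'react';
import type { SplitParameters } from './SplitSettings';

export function useSplitParametersSketch() {
  const [parameters, setParameters] = useState<SplitParameters>({
    mode: '',
    pages: '',
    hDiv: '',
    vDiv: '',
    // ...remaining split fields omitted for brevity
  } as SplitParameters);

  // Single handler covers mode and every other field, including booleans
  const onParameterChange = (parameter: keyof SplitParameters, value: string | boolean) =>
    setParameters(prev => ({ ...prev, [parameter]: value }));

  return { parameters, onParameterChange };
}

// Usage: <SplitSettings parameters={parameters} onParameterChange={onParameterChange} />
```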
useCompressOperation.ts

@@ -1,10 +1,6 @@
-import { useCallback, useState } from 'react';
-import axios from 'axios';
 import { useTranslation } from 'react-i18next';
-import { useFileContext } from '../../../contexts/FileContext';
-import { FileOperation } from '../../../types/fileContext';
-import { zipFileService } from '../../../services/zipFileService';
-import { generateThumbnailForFile } from '../../../utils/thumbnailUtils';
+import { useToolOperation, ToolOperationConfig } from '../shared/useToolOperation';
+import { createStandardErrorHandler } from '../../../utils/toolErrorHandler';
 
 export interface CompressParameters {
   compressionLevel: number;
@@ -15,254 +11,40 @@ export interface CompressParameters {
   fileSizeUnit: 'KB' | 'MB';
 }
 
-[… ~240 lines of the previous manual implementation removed: the CompressOperationHook interface, local useState-based state (files, thumbnails, download URL, status, error, loading), blob-URL tracking and cleanup, a useCallback buildFormData, createOperation metadata assembly, a per-file axios loop with ZIP packaging and thumbnail generation, and resetResults/clearError helpers …]
+const buildFormData = (parameters: CompressParameters, file: File): FormData => {
+  const formData = new FormData();
+  formData.append("fileInput", file);
+
+  if (parameters.compressionMethod === 'quality') {
+    formData.append("optimizeLevel", parameters.compressionLevel.toString());
+  } else {
+    // File size method
+    const fileSize = parameters.fileSizeValue ? `${parameters.fileSizeValue}${parameters.fileSizeUnit}` : '';
+    if (fileSize) {
+      formData.append("expectedOutputSize", fileSize);
+    }
+  }
+
+  formData.append("grayscale", parameters.grayscale.toString());
+  return formData;
+};
+
+export const useCompressOperation = () => {
+  const { t } = useTranslation();
+
+  return useToolOperation<CompressParameters>({
+    operationType: 'compress',
+    endpoint: '/api/v1/misc/compress-pdf',
+    buildFormData,
+    filePrefix: 'compressed_',
+    singleFileMode: false, // Process files individually
+    timeout: 60000, // 1 minute timeout per file
+    validateParams: (params) => {
+      if (params.compressionMethod === 'filesize' && !params.fileSizeValue) {
+        return { valid: false, errors: [t('compress.validation.fileSizeRequired', 'File size value is required when using filesize method')] };
+      }
+      return { valid: true };
+    },
+    getErrorMessage: createStandardErrorHandler(t('compress.error.failed', 'An error occurred while compressing the PDF.'))
+  });
+};
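
The compress hook above delegates error text to `createStandardErrorHandler`. As a point of reference, a plausible sketch of what that utility does, mirroring the inline `getErrorMessage` written out by hand in `useConvertOperation` below (backend string body first, then `error.message`, then the translated fallback); the actual implementation in `utils/toolErrorHandler` may differ.

```typescript
// Plausible sketch of createStandardErrorHandler; not the actual utility in this commit.
export const createStandardErrorHandler = (fallbackMessage: string) =>
  (error: any): string => {
    // Prefer a plain-text error body returned by the backend
    if (error?.response?.data && typeof error.response.data === 'string') {
      return error.response.data;
    }
    // Then the JavaScript error message, then the translated default
    if (error?.message) {
      return error.message;
    }
    return fallbackMessage;
  };
```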
useConvertOperation.ts

@@ -1,35 +1,13 @@
-import { useCallback, useState, useEffect } from 'react';
+import { useCallback } from 'react';
 import axios from 'axios';
 import { useTranslation } from 'react-i18next';
-import { useFileContext } from '../../../contexts/FileContext';
-import { FileOperation } from '../../../types/fileContext';
-import { generateThumbnailForFile } from '../../../utils/thumbnailUtils';
 import { ConvertParameters } from './useConvertParameters';
 import { detectFileExtension } from '../../../utils/fileUtils';
 import { createFileFromApiResponse } from '../../../utils/fileResponseUtils';
+import { useToolOperation, ToolOperationConfig } from '../shared/useToolOperation';
+
 import { getEndpointUrl, isImageFormat, isWebFormat } from '../../../utils/convertUtils';
 
-export interface ConvertOperationHook {
-  executeOperation: (
-    parameters: ConvertParameters,
-    selectedFiles: File[]
-  ) => Promise<void>;
-
-  // Flattened result properties for cleaner access
-  files: File[];
-  thumbnails: string[];
-  isGeneratingThumbnails: boolean;
-  downloadUrl: string | null;
-  downloadFilename: string;
-  status: string;
-  errorMessage: string | null;
-  isLoading: boolean;
-
-  // Result management functions
-  resetResults: () => void;
-  clearError: () => void;
-}
-
 const shouldProcessFilesSeparately = (
   selectedFiles: File[],
@@ -65,361 +43,70 @@ const createFileFromResponse = (
   return createFileFromApiResponse(responseData, headers, fallbackFilename);
 };
 
-[… ~350 lines of the previous manual implementation removed: generateThumbnailsForFiles, createDownloadInfo (JSZip packaging), local useState-based state, a useCallback buildFormData with the same per-format branches, createOperation metadata assembly, processResults, executeOperation with separate executeMultipleSeparateFiles / executeSingleCombinedOperation paths, resetResults/clearError, and the blob-URL cleanup useEffect …]
+const buildFormData = (parameters: ConvertParameters, selectedFiles: File[]): FormData => {
+  const formData = new FormData();
+
+  selectedFiles.forEach(file => {
+    formData.append("fileInput", file);
+  });
+
+  const { fromExtension, toExtension, imageOptions, htmlOptions, emailOptions, pdfaOptions } = parameters;
+
+  if (isImageFormat(toExtension)) {
+    formData.append("imageFormat", toExtension);
+    formData.append("colorType", imageOptions.colorType);
+    formData.append("dpi", imageOptions.dpi.toString());
+    formData.append("singleOrMultiple", imageOptions.singleOrMultiple);
+  } else if (fromExtension === 'pdf' && ['docx', 'odt'].includes(toExtension)) {
+    formData.append("outputFormat", toExtension);
+  } else if (fromExtension === 'pdf' && ['pptx', 'odp'].includes(toExtension)) {
+    formData.append("outputFormat", toExtension);
+  } else if (fromExtension === 'pdf' && ['txt', 'rtf'].includes(toExtension)) {
+    formData.append("outputFormat", toExtension);
+  } else if ((isImageFormat(fromExtension) || fromExtension === 'image') && toExtension === 'pdf') {
+    formData.append("fitOption", imageOptions.fitOption);
+    formData.append("colorType", imageOptions.colorType);
+    formData.append("autoRotate", imageOptions.autoRotate.toString());
+  } else if ((fromExtension === 'html' || fromExtension === 'zip') && toExtension === 'pdf') {
+    formData.append("zoom", htmlOptions.zoomLevel.toString());
+  } else if (fromExtension === 'eml' && toExtension === 'pdf') {
+    formData.append("includeAttachments", emailOptions.includeAttachments.toString());
+    formData.append("maxAttachmentSizeMB", emailOptions.maxAttachmentSizeMB.toString());
+    formData.append("downloadHtml", emailOptions.downloadHtml.toString());
+    formData.append("includeAllRecipients", emailOptions.includeAllRecipients.toString());
+  } else if (fromExtension === 'pdf' && toExtension === 'pdfa') {
+    formData.append("outputFormat", pdfaOptions.outputFormat);
+  } else if (fromExtension === 'pdf' && toExtension === 'csv') {
+    formData.append("pageNumbers", "all");
+  }
+
+  return formData;
+};
+
+export const useConvertOperation = () => {
+  const { t } = useTranslation();
+
+  return useToolOperation<ConvertParameters>({
+    operationType: 'convert',
+    endpoint: (params) => getEndpointUrl(params.fromExtension, params.toExtension) || '',
+    buildFormData: buildFormData, // Clean multi-file signature: (params, selectedFiles) => FormData
+    filePrefix: 'converted_',
+    responseHandler: {
+      type: 'single'
+    },
+    validateParams: (params) => {
+      // Add any validation if needed
+      return { valid: true };
+    },
+    getErrorMessage: (error) => {
+      if (error.response?.data && typeof error.response.data === 'string') {
+        return error.response.data;
+      }
+      if (error.message) {
+        return error.message;
+      }
+      return t("convert.errorConversion", "An error occurred while converting the file.");
+    }
+  });
+};
useOCROperation.ts

@@ -1,9 +1,9 @@
-import { useState, useCallback } from 'react';
+import { useCallback } from 'react';
 import axios from 'axios';
 import { useTranslation } from 'react-i18next';
-import { useFileContext } from '../../../contexts/FileContext';
-import { FileOperation } from '../../../types/fileContext';
 import { OCRParameters } from '../../../components/tools/ocr/OCRSettings';
+import { useToolOperation, ToolOperationConfig } from '../shared/useToolOperation';
+import { createStandardErrorHandler } from '../../../utils/toolErrorHandler';
 
 //Extract files from a ZIP blob
 async function extractZipFile(zipBlob: Blob): Promise<File[]> {
@@ -41,332 +41,155 @@ function getMimeType(filename: string): string {
   }
 }
 
-[… the previous implementation removed throughout this hunk: the OCROperationHook interface, useFileContext wiring, local useState-based state, blob-URL tracking and cleanup, a useCallback buildFormData returning { formData, endpoint }, createOperation metadata assembly, and the old executeOperation that validated files, looped axios requests, inspected ZIP/PDF responses, packaged downloads, and updated status/thumbnails by hand …]
+const buildFormData = (parameters: OCRParameters, file: File): FormData => {
+  const formData = new FormData();
+
+  // Add the file
+  formData.append('fileInput', file);
+
+  // Add languages as multiple parameters with same name (like checkboxes)
+  parameters.languages.forEach(lang => {
+    formData.append('languages', lang);
+  });
+
+  // Add other parameters
+  formData.append('ocrType', parameters.ocrType);
+  formData.append('ocrRenderType', parameters.ocrRenderType);
+
+  // Handle additional options - convert array to individual boolean parameters
+  formData.append('sidecar', parameters.additionalOptions.includes('sidecar').toString());
+  formData.append('deskew', parameters.additionalOptions.includes('deskew').toString());
+  formData.append('clean', parameters.additionalOptions.includes('clean').toString());
+  formData.append('cleanFinal', parameters.additionalOptions.includes('cleanFinal').toString());
+  formData.append('removeImagesAfter', parameters.additionalOptions.includes('removeImagesAfter').toString());
+
+  return formData;
+};
+
+export const useOCROperation = () => {
+  const { t } = useTranslation();
+
+  const customOCRProcessor = useCallback(async (
+    parameters: OCRParameters,
+    selectedFiles: File[]
+  ): Promise<File[]> => {
+    if (parameters.languages.length === 0) {
+      throw new Error(t('ocr.validation.languageRequired', 'Please select at least one language for OCR processing.'));
+    }
+
+    const processedFiles: File[] = [];
+    const failedFiles: string[] = [];
+
+    // OCR typically processes one file at a time
+    for (let i = 0; i < selectedFiles.length; i++) {
+      const file = selectedFiles[i];
+
+      try {
+        const formData = buildFormData(parameters, file);
+        const response = await axios.post('/api/v1/misc/ocr-pdf', formData, {
+          responseType: "blob",
+          timeout: 300000 // 5 minute timeout for OCR
+        });
+
+        // Check for HTTP errors
+        if (response.status >= 400) {
+          const errorText = await response.data.text();
+          throw new Error(`OCR service HTTP error ${response.status}: ${errorText.substring(0, 300)}`);
+        }
+
+        // Validate response
+        if (!response.data || response.data.size === 0) {
+          throw new Error('Empty response from OCR service');
+        }
+
+        const contentType = response.headers['content-type'] || 'application/pdf';
+
+        // Check if response is actually a PDF by examining the first few bytes
+        const arrayBuffer = await response.data.arrayBuffer();
+        const uint8Array = new Uint8Array(arrayBuffer);
+        const header = new TextDecoder().decode(uint8Array.slice(0, 4));
+
+        // Check if it's a ZIP file (OCR service returns ZIP when sidecar is enabled or for multi-file results)
+        if (header.startsWith('PK')) {
+          try {
+            // Extract ZIP file contents
+            const zipFiles = await extractZipFile(response.data);
+
+            // Add extracted files to processed files
+            processedFiles.push(...zipFiles);
+          } catch (extractError) {
+            // Fallback to treating as single ZIP file
+            const blob = new Blob([response.data], { type: 'application/zip' });
+            const processedFile = new File([blob], `ocr_${file.name}.zip`, { type: 'application/zip' });
+            processedFiles.push(processedFile);
+          }
+          continue; // Skip the PDF validation for ZIP files
+        }
+
+        if (!header.startsWith('%PDF')) {
+          // Check if it's an error response
+          const text = new TextDecoder().decode(uint8Array.slice(0, 500));
+
+          if (text.includes('error') || text.includes('Error') || text.includes('exception') || text.includes('html')) {
+            // Check for specific OCR tool unavailable error
+            if (text.includes('OCR tools') && text.includes('not installed')) {
+              throw new Error('OCR tools (OCRmyPDF or Tesseract) are not installed on the server. Use the standard or fat Docker image instead of ultra-lite, or install OCR tools manually.');
+            }
+            throw new Error(`OCR service error: ${text.substring(0, 300)}`);
+          }
+
+          // Check if it's an HTML error page
+          if (text.includes('<html') || text.includes('<!DOCTYPE')) {
+            // Try to extract error message from HTML
+            const errorMatch = text.match(/<title[^>]*>([^<]+)<\/title>/i) ||
+                               text.match(/<h1[^>]*>([^<]+)<\/h1>/i) ||
+                               text.match(/<body[^>]*>([^<]+)<\/body>/i);
+            const errorMessage = errorMatch ? errorMatch[1].trim() : t('ocr.error.unknown', 'Unknown error');
+            throw new Error(`OCR service error: ${errorMessage}`);
+          }
+
+          throw new Error(`Response is not a valid PDF file. Header: "${header}"`);
+        }
+
+        const blob = new Blob([response.data], { type: contentType });
+        const processedFile = new File([blob], `ocr_${file.name}`, { type: contentType });
+
+        processedFiles.push(processedFile);
+      } catch (fileError) {
+        const errorMessage = fileError instanceof Error ? fileError.message : t('ocr.error.unknown', 'Unknown error');
+        failedFiles.push(`${file.name} (${errorMessage})`);
+      }
+    }
[…]
|
|
||||||
console.error('OCR operation error:', error);
|
|
||||||
const errorMessage = error instanceof Error ? error.message : 'OCR operation failed';
|
|
||||||
setErrorMessage(errorMessage);
|
|
||||||
setStatus('');
|
|
||||||
markOperationFailed(fileId, operationId, errorMessage);
|
|
||||||
} finally {
|
|
||||||
setIsLoading(false);
|
|
||||||
}
|
}
|
||||||
}, [buildFormData, createOperation, recordOperation, addFiles, cleanupBlobUrls, markOperationApplied, markOperationFailed, t]);
|
|
||||||
|
|
||||||
const resetResults = useCallback(() => {
|
if (failedFiles.length > 0 && processedFiles.length === 0) {
|
||||||
setFiles([]);
|
throw new Error(`Failed to process OCR for all files: ${failedFiles.join(', ')}`);
|
||||||
setThumbnails([]);
|
}
|
||||||
setDownloadUrl(null);
|
|
||||||
setDownloadFilename('');
|
|
||||||
setStatus('');
|
|
||||||
setErrorMessage(null);
|
|
||||||
setIsLoading(false);
|
|
||||||
setIsGeneratingThumbnails(false);
|
|
||||||
cleanupBlobUrls();
|
|
||||||
}, [cleanupBlobUrls]);
|
|
||||||
|
|
||||||
const clearError = useCallback(() => {
|
return processedFiles;
|
||||||
setErrorMessage(null);
|
}, [t]);
|
||||||
}, []);
|
|
||||||
|
|
||||||
return {
|
const ocrConfig: ToolOperationConfig<OCRParameters> = {
|
||||||
files,
|
operationType: 'ocr',
|
||||||
thumbnails,
|
endpoint: '/api/v1/misc/ocr-pdf', // Not used with customProcessor but required
|
||||||
downloadUrl,
|
buildFormData, // Not used with customProcessor but required
|
||||||
downloadFilename,
|
filePrefix: 'ocr_',
|
||||||
isLoading,
|
customProcessor: customOCRProcessor,
|
||||||
isGeneratingThumbnails,
|
timeout: 300000, // 5 minute timeout for OCR
|
||||||
status,
|
validateParams: (params) => {
|
||||||
errorMessage,
|
if (params.languages.length === 0) {
|
||||||
executeOperation,
|
return { valid: false, errors: [t('ocr.validation.languageRequired', 'Please select at least one language for OCR processing.')] };
|
||||||
resetResults,
|
}
|
||||||
clearError,
|
return { valid: true };
|
||||||
|
},
|
||||||
|
getErrorMessage: (error) => {
|
||||||
|
// Handle OCR-specific error first
|
||||||
|
if (error.message?.includes('OCR tools') && error.message?.includes('not installed')) {
|
||||||
|
return 'OCR tools (OCRmyPDF or Tesseract) are not installed on the server. Use the standard or fat Docker image instead of ultra-lite, or install OCR tools manually.';
|
||||||
|
}
|
||||||
|
// Fall back to standard error handling
|
||||||
|
return createStandardErrorHandler(t('ocr.error.failed', 'OCR operation failed'))(error);
|
||||||
|
}
|
||||||
};
|
};
|
||||||
|
|
||||||
|
return useToolOperation(ocrConfig);
|
||||||
};
|
};
|
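The refactored OCR hook keeps its bespoke response handling by supplying a `customProcessor`, which `useToolOperation` calls instead of its default per-file request path while still taking care of thumbnails, download URLs, and FileContext tracking. A minimal sketch of that escape hatch — the tool name, parameter shape, and endpoint below are illustrative, not part of the commit:

```typescript
import { useToolOperation, ToolOperationConfig } from '../shared/useToolOperation';

// Hypothetical parameter shape, for illustration only.
interface ExampleParams {
  quality: number;
}

// A customProcessor receives the validated parameters and files and must return the
// processed files; state, thumbnails, downloads, and operation tracking stay in the shared hook.
const exampleProcessor = async (params: ExampleParams, files: File[]): Promise<File[]> => {
  // ...call the backend however this tool needs to...
  return files;
};

const exampleConfig: ToolOperationConfig<ExampleParams> = {
  operationType: 'example',
  endpoint: '/api/v1/example',          // required by the type, unused once customProcessor is set
  buildFormData: () => new FormData(),  // likewise unused here
  filePrefix: 'example_',
  customProcessor: exampleProcessor,
};

export const useExampleOperation = () => useToolOperation(exampleConfig);
```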
frontend/src/hooks/tools/shared/useToolApiCalls.ts (new file)
@@ -0,0 +1,87 @@
```typescript
import { useCallback, useRef } from 'react';
import axios, { CancelTokenSource } from 'axios';
import { processResponse } from '../../../utils/toolResponseProcessor';
import type { ResponseHandler, ProcessingProgress } from './useToolState';

export interface ApiCallsConfig<TParams = void> {
  endpoint: string | ((params: TParams) => string);
  buildFormData: (file: File, params: TParams) => FormData;
  filePrefix: string;
  responseHandler?: ResponseHandler;
  timeout?: number;
}

export const useToolApiCalls = <TParams = void>() => {
  const cancelTokenRef = useRef<CancelTokenSource | null>(null);

  const processFiles = useCallback(async (
    params: TParams,
    validFiles: File[],
    config: ApiCallsConfig<TParams>,
    onProgress: (progress: ProcessingProgress) => void,
    onStatus: (status: string) => void
  ): Promise<File[]> => {
    const processedFiles: File[] = [];
    const failedFiles: string[] = [];
    const total = validFiles.length;

    // Create cancel token for this operation
    cancelTokenRef.current = axios.CancelToken.source();

    for (let i = 0; i < validFiles.length; i++) {
      const file = validFiles[i];

      onProgress({ current: i + 1, total, currentFileName: file.name });
      onStatus(`Processing ${file.name} (${i + 1}/${total})`);

      try {
        const formData = config.buildFormData(file, params);
        const endpoint = typeof config.endpoint === 'function' ? config.endpoint(params) : config.endpoint;
        const response = await axios.post(endpoint, formData, {
          responseType: 'blob',
          timeout: config.timeout || 120000,
          cancelToken: cancelTokenRef.current.token
        });

        const responseFiles = await processResponse(
          response.data,
          [file],
          config.filePrefix,
          config.responseHandler
        );
        processedFiles.push(...responseFiles);

      } catch (error) {
        if (axios.isCancel(error)) {
          throw new Error('Operation was cancelled');
        }
        console.error(`Failed to process ${file.name}:`, error);
        failedFiles.push(file.name);
      }
    }

    if (failedFiles.length > 0 && processedFiles.length === 0) {
      throw new Error(`Failed to process all files: ${failedFiles.join(', ')}`);
    }

    if (failedFiles.length > 0) {
      onStatus(`Processed ${processedFiles.length}/${total} files. Failed: ${failedFiles.join(', ')}`);
    } else {
      onStatus(`Successfully processed ${processedFiles.length} file${processedFiles.length === 1 ? '' : 's'}`);
    }

    return processedFiles;
  }, []);

  const cancelOperation = useCallback(() => {
    if (cancelTokenRef.current) {
      cancelTokenRef.current.cancel('Operation cancelled by user');
      cancelTokenRef.current = null;
    }
  }, []);

  return {
    processFiles,
    cancelOperation,
  };
};
```
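`useToolApiCalls` issues one POST per file, funnels each blob through `processResponse`, and exposes a cancel handle. A rough sketch of a caller, assuming a hypothetical rotate tool (the endpoint, parameter type, and hook name are placeholders):

```typescript
import { useCallback } from 'react';
import { useToolApiCalls, ApiCallsConfig } from './useToolApiCalls';

// Hypothetical parameter type and endpoint, for illustration only.
interface RotateParams { angle: number }

const rotateConfig: ApiCallsConfig<RotateParams> = {
  endpoint: '/api/v1/example/rotate',
  buildFormData: (file, params) => {
    const formData = new FormData();
    formData.append('fileInput', file);
    formData.append('angle', params.angle.toString());
    return formData;
  },
  filePrefix: 'rotated_',
  timeout: 60000,
};

export const useRotateFiles = () => {
  const { processFiles, cancelOperation } = useToolApiCalls<RotateParams>();

  const rotate = useCallback(async (files: File[]) => {
    // One POST per file; progress and status callbacks drive the UI.
    return processFiles(
      { angle: 90 },
      files,
      rotateConfig,
      (progress) => console.log(`${progress.current}/${progress.total}`),
      (status) => console.log(status)
    );
  }, [processFiles]);

  return { rotate, cancelOperation };
};
```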
frontend/src/hooks/tools/shared/useToolOperation.ts (new file)
@@ -0,0 +1,241 @@
```typescript
import { useCallback } from 'react';
import axios from 'axios';
import { useTranslation } from 'react-i18next';
import { useFileContext } from '../../../contexts/FileContext';
import { useToolState, type ProcessingProgress } from './useToolState';
import { useToolApiCalls, type ApiCallsConfig } from './useToolApiCalls';
import { useToolResources } from './useToolResources';
import { extractErrorMessage } from '../../../utils/toolErrorHandler';
import { createOperation } from '../../../utils/toolOperationTracker';
import type { ResponseHandler } from '../../../utils/toolResponseProcessor';

export interface ValidationResult {
  valid: boolean;
  errors?: string[];
}

// Re-export for backwards compatibility
export type { ProcessingProgress, ResponseHandler };

/**
 * Configuration for tool operations defining processing behavior and API integration
 */
export interface ToolOperationConfig<TParams = void> {
  /** Operation identifier for tracking and logging */
  operationType: string;

  /** API endpoint for the operation (can be string or function for dynamic endpoints) */
  endpoint: string | ((params: TParams) => string);

  /** Builds FormData for API request - signature indicates single-file vs multi-file capability */
  buildFormData: ((params: TParams, file: File) => FormData) | ((params: TParams, files: File[]) => FormData);

  /** Prefix for processed filenames (e.g., 'compressed_', 'repaired_') */
  filePrefix: string;

  /** How to handle API responses */
  responseHandler?: ResponseHandler;

  /** Process files individually or as a batch */
  singleFileMode?: boolean;

  /** Custom processing logic that bypasses default file processing */
  customProcessor?: (params: TParams, files: File[]) => Promise<File[]>;

  /** Validate parameters before execution */
  validateParams?: (params: TParams) => ValidationResult;

  /** Extract user-friendly error messages */
  getErrorMessage?: (error: any) => string;

  /** Request timeout in milliseconds */
  timeout?: number;
}

/**
 * Complete tool operation interface with execution capability
 */
export interface ToolOperationHook<TParams = void> {
  // State
  files: File[];
  thumbnails: string[];
  isGeneratingThumbnails: boolean;
  downloadUrl: string | null;
  downloadFilename: string;
  isLoading: boolean;
  status: string;
  errorMessage: string | null;
  progress: ProcessingProgress | null;

  // Actions
  executeOperation: (params: TParams, selectedFiles: File[]) => Promise<void>;
  resetResults: () => void;
  clearError: () => void;
  cancelOperation: () => void;
}

// Re-export for backwards compatibility
export { createStandardErrorHandler } from '../../../utils/toolErrorHandler';

/**
 * Shared hook for tool operations with consistent error handling, progress tracking,
 * and FileContext integration. Eliminates boilerplate while maintaining flexibility.
 */
export const useToolOperation = <TParams = void>(
  config: ToolOperationConfig<TParams>
): ToolOperationHook<TParams> => {
  const { t } = useTranslation();
  const { recordOperation, markOperationApplied, markOperationFailed, addFiles } = useFileContext();

  // Composed hooks
  const { state, actions } = useToolState();
  const { processFiles, cancelOperation: cancelApiCalls } = useToolApiCalls<TParams>();
  const { generateThumbnails, createDownloadInfo, cleanupBlobUrls } = useToolResources();

  const executeOperation = useCallback(async (
    params: TParams,
    selectedFiles: File[]
  ): Promise<void> => {
    // Validation
    if (selectedFiles.length === 0) {
      actions.setError(t('noFileSelected', 'No files selected'));
      return;
    }

    if (config.validateParams) {
      const validation = config.validateParams(params);
      if (!validation.valid) {
        actions.setError(validation.errors?.join(', ') || 'Invalid parameters');
        return;
      }
    }

    const validFiles = selectedFiles.filter(file => file.size > 0);
    if (validFiles.length === 0) {
      actions.setError(t('noValidFiles', 'No valid files to process'));
      return;
    }

    // Setup operation tracking
    const { operation, operationId, fileId } = createOperation(config.operationType, params, selectedFiles);
    recordOperation(fileId, operation);

    // Reset state
    actions.setLoading(true);
    actions.setError(null);
    actions.resetResults();
    cleanupBlobUrls();

    try {
      let processedFiles: File[];

      if (config.customProcessor) {
        actions.setStatus('Processing files...');
        processedFiles = await config.customProcessor(params, validFiles);
      } else {
        // Detect if buildFormData signature is multi-file or single-file
        // Both have 2 params now, so check if second param expects an array
        const isMultiFileFormData = /files|selectedFiles/.test(config.buildFormData.toString());

        if (isMultiFileFormData) {
          // Multi-file processing - single API call with all files
          actions.setStatus('Processing files...');
          const formData = (config.buildFormData as (params: TParams, files: File[]) => FormData)(params, validFiles);
          const endpoint = typeof config.endpoint === 'function' ? config.endpoint(params) : config.endpoint;

          const response = await axios.post(endpoint, formData, { responseType: 'blob' });

          // Handle response based on responseHandler
          if (config.responseHandler?.type === 'zip' && config.responseHandler?.useZipExtractor) {
            const zipFile = new File([response.data], 'results.zip', { type: 'application/zip' });
            const { zipFileService } = await import('../../../services/zipFileService');
            const extractionResult = await zipFileService.extractPdfFiles(zipFile);
            processedFiles = extractionResult.success ? extractionResult.extractedFiles : [];
          } else {
            // Single file response
            const filename = validFiles.length === 1
              ? `${config.filePrefix}${validFiles[0].name}`
              : `${config.filePrefix}result.pdf`;
            processedFiles = [new File([response.data], filename, { type: response.data.type })];
          }
        } else {
          // Individual file processing - separate API call per file
          const apiCallsConfig: ApiCallsConfig<TParams> = {
            endpoint: config.endpoint,
            buildFormData: (file: File, params: TParams) => (config.buildFormData as (params: TParams, file: File) => FormData)(params, file),
            filePrefix: config.filePrefix,
            responseHandler: config.responseHandler,
            timeout: config.timeout
          };
          processedFiles = await processFiles(
            params,
            validFiles,
            apiCallsConfig,
            actions.setProgress,
            actions.setStatus
          );
        }
      }

      if (processedFiles.length > 0) {
        actions.setFiles(processedFiles);

        // Generate thumbnails and download URL concurrently
        actions.setGeneratingThumbnails(true);
        const [thumbnails, downloadInfo] = await Promise.all([
          generateThumbnails(processedFiles),
          createDownloadInfo(processedFiles, config.operationType)
        ]);
        actions.setGeneratingThumbnails(false);

        actions.setThumbnails(thumbnails);
        actions.setDownloadInfo(downloadInfo.url, downloadInfo.filename);

        // Add to file context
        await addFiles(processedFiles);

        markOperationApplied(fileId, operationId);
      }

    } catch (error: any) {
      const errorMessage = config.getErrorMessage?.(error) || extractErrorMessage(error);
      actions.setError(errorMessage);
      actions.setStatus('');
      markOperationFailed(fileId, operationId, errorMessage);
    } finally {
      actions.setLoading(false);
      actions.setProgress(null);
    }
  }, [t, config, actions, recordOperation, markOperationApplied, markOperationFailed, addFiles, processFiles, generateThumbnails, createDownloadInfo, cleanupBlobUrls]);

  const cancelOperation = useCallback(() => {
    cancelApiCalls();
    actions.setLoading(false);
    actions.setProgress(null);
    actions.setStatus('Operation cancelled');
  }, [cancelApiCalls, actions]);

  const resetResults = useCallback(() => {
    cleanupBlobUrls();
    actions.resetResults();
  }, [cleanupBlobUrls, actions]);

  return {
    // State
    files: state.files,
    thumbnails: state.thumbnails,
    isGeneratingThumbnails: state.isGeneratingThumbnails,
    downloadUrl: state.downloadUrl,
    downloadFilename: state.downloadFilename,
    isLoading: state.isLoading,
    status: state.status,
    errorMessage: state.errorMessage,
    progress: state.progress,

    // Actions
    executeOperation,
    resetResults,
    clearError: actions.clearError,
    cancelOperation
  };
};
```
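On the consuming side, a tool component only sees the `ToolOperationHook` surface. A sketch of a panel wired to a hook built with `useToolOperation` — the component, hook, and parameter names here are placeholders, not part of the commit:

```typescript
// Illustrative component; everything except the hook's own API is a placeholder.
const ExampleToolPanel = ({ selectedFiles }: { selectedFiles: File[] }) => {
  const operation = useExampleOperation(); // any hook that returns useToolOperation(config)
  const params = { quality: 5 };           // tool-specific parameters, illustrative

  return (
    <div>
      <button
        disabled={operation.isLoading || selectedFiles.length === 0}
        onClick={() => operation.executeOperation(params, selectedFiles)}
      >
        Run
      </button>
      {operation.progress && (
        <span>{operation.progress.current}/{operation.progress.total}</span>
      )}
      {operation.errorMessage && (
        <p onClick={operation.clearError}>{operation.errorMessage}</p>
      )}
      {operation.downloadUrl && (
        <a href={operation.downloadUrl} download={operation.downloadFilename}>Download</a>
      )}
    </div>
  );
};
```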
frontend/src/hooks/tools/shared/useToolResources.ts (new file)
@@ -0,0 +1,81 @@
```typescript
import { useState, useCallback, useEffect } from 'react';
import { generateThumbnailForFile } from '../../../utils/thumbnailUtils';

export const useToolResources = () => {
  const [blobUrls, setBlobUrls] = useState<string[]>([]);

  const addBlobUrl = useCallback((url: string) => {
    setBlobUrls(prev => [...prev, url]);
  }, []);

  const cleanupBlobUrls = useCallback(() => {
    blobUrls.forEach(url => {
      try {
        URL.revokeObjectURL(url);
      } catch (error) {
        console.warn('Failed to revoke blob URL:', error);
      }
    });
    setBlobUrls([]);
  }, [blobUrls]);

  // Cleanup on unmount
  useEffect(() => {
    return () => {
      blobUrls.forEach(url => {
        try {
          URL.revokeObjectURL(url);
        } catch (error) {
          console.warn('Failed to revoke blob URL during cleanup:', error);
        }
      });
    };
  }, [blobUrls]);

  const generateThumbnails = useCallback(async (files: File[]): Promise<string[]> => {
    const thumbnails: string[] = [];

    for (const file of files) {
      try {
        const thumbnail = await generateThumbnailForFile(file);
        thumbnails.push(thumbnail);
      } catch (error) {
        console.warn(`Failed to generate thumbnail for ${file.name}:`, error);
        thumbnails.push('');
      }
    }

    return thumbnails;
  }, []);

  const createDownloadInfo = useCallback(async (
    files: File[],
    operationType: string
  ): Promise<{ url: string; filename: string }> => {
    if (files.length === 1) {
      const url = URL.createObjectURL(files[0]);
      addBlobUrl(url);
      return { url, filename: files[0].name };
    }

    // Multiple files - create zip
    const JSZip = (await import('jszip')).default;
    const zip = new JSZip();

    files.forEach(file => {
      zip.file(file.name, file);
    });

    const zipBlob = await zip.generateAsync({ type: 'blob' });
    const url = URL.createObjectURL(zipBlob);
    addBlobUrl(url);

    return { url, filename: `${operationType}_results.zip` };
  }, [addBlobUrl]);

  return {
    generateThumbnails,
    createDownloadInfo,
    cleanupBlobUrls,
  };
};
```
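`createDownloadInfo` hands back a single object URL for one file and zips several, while every URL it creates is remembered so `cleanupBlobUrls` can revoke it later. A small illustrative helper showing the intended call order (the wrapper hook itself is not part of the commit):

```typescript
import { useCallback } from 'react';
import { useToolResources } from './useToolResources';

// Illustrative helper hook; names other than useToolResources' API are placeholders.
export const usePublishResults = () => {
  const { generateThumbnails, createDownloadInfo, cleanupBlobUrls } = useToolResources();

  return useCallback(async (processedFiles: File[], operationType: string) => {
    cleanupBlobUrls(); // revoke URLs from a previous run before creating new ones
    const thumbnails = await generateThumbnails(processedFiles);          // '' entries mark failed thumbnails
    const download = await createDownloadInfo(processedFiles, operationType); // one URL, or a zip for several files
    return { thumbnails, downloadUrl: download.url, downloadFilename: download.filename };
  }, [generateThumbnails, createDownloadInfo, cleanupBlobUrls]);
};
```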
frontend/src/hooks/tools/shared/useToolState.ts (new file)
@@ -0,0 +1,137 @@
```typescript
import { useReducer, useCallback } from 'react';

export interface ProcessingProgress {
  current: number;
  total: number;
  currentFileName?: string;
}

export interface OperationState {
  files: File[];
  thumbnails: string[];
  isGeneratingThumbnails: boolean;
  downloadUrl: string | null;
  downloadFilename: string;
  isLoading: boolean;
  status: string;
  errorMessage: string | null;
  progress: ProcessingProgress | null;
}

type OperationAction =
  | { type: 'SET_LOADING'; payload: boolean }
  | { type: 'SET_FILES'; payload: File[] }
  | { type: 'SET_THUMBNAILS'; payload: string[] }
  | { type: 'SET_GENERATING_THUMBNAILS'; payload: boolean }
  | { type: 'SET_DOWNLOAD_INFO'; payload: { url: string | null; filename: string } }
  | { type: 'SET_STATUS'; payload: string }
  | { type: 'SET_ERROR'; payload: string | null }
  | { type: 'SET_PROGRESS'; payload: ProcessingProgress | null }
  | { type: 'RESET_RESULTS' }
  | { type: 'CLEAR_ERROR' };

const initialState: OperationState = {
  files: [],
  thumbnails: [],
  isGeneratingThumbnails: false,
  downloadUrl: null,
  downloadFilename: '',
  isLoading: false,
  status: '',
  errorMessage: null,
  progress: null,
};

const operationReducer = (state: OperationState, action: OperationAction): OperationState => {
  switch (action.type) {
    case 'SET_LOADING':
      return { ...state, isLoading: action.payload };
    case 'SET_FILES':
      return { ...state, files: action.payload };
    case 'SET_THUMBNAILS':
      return { ...state, thumbnails: action.payload };
    case 'SET_GENERATING_THUMBNAILS':
      return { ...state, isGeneratingThumbnails: action.payload };
    case 'SET_DOWNLOAD_INFO':
      return {
        ...state,
        downloadUrl: action.payload.url,
        downloadFilename: action.payload.filename
      };
    case 'SET_STATUS':
      return { ...state, status: action.payload };
    case 'SET_ERROR':
      return { ...state, errorMessage: action.payload };
    case 'SET_PROGRESS':
      return { ...state, progress: action.payload };
    case 'RESET_RESULTS':
      return {
        ...initialState,
        isLoading: state.isLoading, // Preserve loading state during reset
      };
    case 'CLEAR_ERROR':
      return { ...state, errorMessage: null };
    default:
      return state;
  }
};

export const useToolState = () => {
  const [state, dispatch] = useReducer(operationReducer, initialState);

  const setLoading = useCallback((loading: boolean) => {
    dispatch({ type: 'SET_LOADING', payload: loading });
  }, []);

  const setFiles = useCallback((files: File[]) => {
    dispatch({ type: 'SET_FILES', payload: files });
  }, []);

  const setThumbnails = useCallback((thumbnails: string[]) => {
    dispatch({ type: 'SET_THUMBNAILS', payload: thumbnails });
  }, []);

  const setGeneratingThumbnails = useCallback((generating: boolean) => {
    dispatch({ type: 'SET_GENERATING_THUMBNAILS', payload: generating });
  }, []);

  const setDownloadInfo = useCallback((url: string | null, filename: string) => {
    dispatch({ type: 'SET_DOWNLOAD_INFO', payload: { url, filename } });
  }, []);

  const setStatus = useCallback((status: string) => {
    dispatch({ type: 'SET_STATUS', payload: status });
  }, []);

  const setError = useCallback((error: string | null) => {
    dispatch({ type: 'SET_ERROR', payload: error });
  }, []);

  const setProgress = useCallback((progress: ProcessingProgress | null) => {
    dispatch({ type: 'SET_PROGRESS', payload: progress });
  }, []);

  const resetResults = useCallback(() => {
    dispatch({ type: 'RESET_RESULTS' });
  }, []);

  const clearError = useCallback(() => {
    dispatch({ type: 'CLEAR_ERROR' });
  }, []);

  return {
    state,
    actions: {
      setLoading,
      setFiles,
      setThumbnails,
      setGeneratingThumbnails,
      setDownloadInfo,
      setStatus,
      setError,
      setProgress,
      resetResults,
      clearError,
    },
  };
};
```
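All UI state transitions go through the reducer, so orchestration code only talks to `actions`. A sketch of typical usage from a wrapping hook (the wrapper is illustrative, not part of the commit):

```typescript
import { useToolState } from './useToolState';

// Illustrative hook showing the setLoading/setError/setFiles flow around an async task.
export const useExampleFlow = () => {
  const { state, actions } = useToolState();

  const run = async (work: () => Promise<File[]>) => {
    actions.setLoading(true);
    actions.setError(null);
    try {
      const files = await work();
      actions.setFiles(files);
      actions.setStatus(`Successfully processed ${files.length} file(s)`);
    } catch (e) {
      actions.setError(e instanceof Error ? e.message : 'Operation failed');
    } finally {
      actions.setLoading(false);
    }
  };

  return { state, run };
};
```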
```diff
@@ -1,242 +1,85 @@
-import { useCallback, useState } from 'react';
+import { useCallback } from 'react';
 import axios from 'axios';
 import { useTranslation } from 'react-i18next';
-import { useFileContext } from '../../../contexts/FileContext';
-import { FileOperation } from '../../../types/fileContext';
-import { zipFileService } from '../../../services/zipFileService';
-import { generateThumbnailForFile } from '../../../utils/thumbnailUtils';
+import { useToolOperation, ToolOperationConfig } from '../shared/useToolOperation';
+import { createStandardErrorHandler } from '../../../utils/toolErrorHandler';
 import { SplitParameters } from '../../../components/tools/split/SplitSettings';
-import { SPLIT_MODES, ENDPOINTS, type SplitMode } from '../../../constants/splitConstants';
+import { SPLIT_MODES } from '../../../constants/splitConstants';
 
-export interface SplitOperationHook {
-  executeOperation: (
-    mode: SplitMode | '',
-    parameters: SplitParameters,
-    selectedFiles: File[]
-  ) => Promise<void>;
-
-  // Flattened result properties for cleaner access
-  files: File[];
-  thumbnails: string[];
-  isGeneratingThumbnails: boolean;
-  downloadUrl: string | null;
-  status: string;
-  errorMessage: string | null;
-  isLoading: boolean;
-
-  // Result management functions
-  resetResults: () => void;
-  clearError: () => void;
-}
-
-export const useSplitOperation = (): SplitOperationHook => {
+const buildFormData = (parameters: SplitParameters, selectedFiles: File[]): FormData => {
+  const formData = new FormData();
+
+  selectedFiles.forEach(file => {
+    formData.append("fileInput", file);
+  });
+
+  switch (parameters.mode) {
+    case SPLIT_MODES.BY_PAGES:
+      formData.append("pageNumbers", parameters.pages);
+      break;
+    case SPLIT_MODES.BY_SECTIONS:
+      formData.append("horizontalDivisions", parameters.hDiv);
+      formData.append("verticalDivisions", parameters.vDiv);
+      formData.append("merge", parameters.merge.toString());
+      break;
+    case SPLIT_MODES.BY_SIZE_OR_COUNT:
+      formData.append(
+        "splitType",
+        parameters.splitType === "size" ? "0" : parameters.splitType === "pages" ? "1" : "2"
+      );
+      formData.append("splitValue", parameters.splitValue);
+      break;
+    case SPLIT_MODES.BY_CHAPTERS:
+      formData.append("bookmarkLevel", parameters.bookmarkLevel);
+      formData.append("includeMetadata", parameters.includeMetadata.toString());
+      formData.append("allowDuplicates", parameters.allowDuplicates.toString());
+      break;
+    default:
+      throw new Error(`Unknown split mode: ${parameters.mode}`);
+  }
+
+  return formData;
+};
+
+const getEndpoint = (parameters: SplitParameters): string => {
+  switch (parameters.mode) {
+    case SPLIT_MODES.BY_PAGES:
+      return "/api/v1/general/split-pages";
+    case SPLIT_MODES.BY_SECTIONS:
+      return "/api/v1/general/split-pdf-by-sections";
+    case SPLIT_MODES.BY_SIZE_OR_COUNT:
+      return "/api/v1/general/split-by-size-or-count";
+    case SPLIT_MODES.BY_CHAPTERS:
+      return "/api/v1/general/split-pdf-by-chapters";
+    default:
+      throw new Error(`Unknown split mode: ${parameters.mode}`);
+  }
+};
+
+export const useSplitOperation = () => {
   const { t } = useTranslation();
-  const {
-    recordOperation,
-    markOperationApplied,
-    markOperationFailed,
-    addFiles
-  } = useFileContext();
-
-  // Internal state management (replacing useOperationResults)
-  const [files, setFiles] = useState<File[]>([]);
-  const [thumbnails, setThumbnails] = useState<string[]>([]);
-  const [isGeneratingThumbnails, setIsGeneratingThumbnails] = useState(false);
-  const [downloadUrl, setDownloadUrl] = useState<string | null>(null);
-  const [status, setStatus] = useState('');
-  const [errorMessage, setErrorMessage] = useState<string | null>(null);
-  const [isLoading, setIsLoading] = useState(false);
-
-  const buildFormData = useCallback((
-    mode: SplitMode | '',
-    parameters: SplitParameters,
-    selectedFiles: File[]
-  ) => {
-    const formData = new FormData();
-
-    selectedFiles.forEach(file => {
-      formData.append("fileInput", file);
-    });
-
-    if (!mode) {
-      throw new Error('Split mode is required');
-    }
-
-    let endpoint = "";
-
-    switch (mode) {
-      case SPLIT_MODES.BY_PAGES:
-        formData.append("pageNumbers", parameters.pages);
-        endpoint = "/api/v1/general/split-pages";
-        break;
-      case SPLIT_MODES.BY_SECTIONS:
-        formData.append("horizontalDivisions", parameters.hDiv);
-        formData.append("verticalDivisions", parameters.vDiv);
-        formData.append("merge", parameters.merge.toString());
-        endpoint = "/api/v1/general/split-pdf-by-sections";
-        break;
-      case SPLIT_MODES.BY_SIZE_OR_COUNT:
-        formData.append(
-          "splitType",
-          parameters.splitType === "size" ? "0" : parameters.splitType === "pages" ? "1" : "2"
-        );
-        formData.append("splitValue", parameters.splitValue);
-        endpoint = "/api/v1/general/split-by-size-or-count";
-        break;
-      case SPLIT_MODES.BY_CHAPTERS:
-        formData.append("bookmarkLevel", parameters.bookmarkLevel);
-        formData.append("includeMetadata", parameters.includeMetadata.toString());
-        formData.append("allowDuplicates", parameters.allowDuplicates.toString());
-        endpoint = "/api/v1/general/split-pdf-by-chapters";
-        break;
-      default:
-        throw new Error(`Unknown split mode: ${mode}`);
-    }
-
-    return { formData, endpoint };
-  }, []);
-
-  const createOperation = useCallback((
-    mode: SplitMode | '',
-    parameters: SplitParameters,
-    selectedFiles: File[]
-  ): { operation: FileOperation; operationId: string; fileId: string } => {
-    const operationId = `split-${Date.now()}-${Math.random().toString(36).substr(2, 9)}`;
-    const fileId = selectedFiles[0].name;
-
-    const operation: FileOperation = {
-      id: operationId,
-      type: 'split',
-      timestamp: Date.now(),
-      fileIds: selectedFiles.map(f => f.name),
-      status: 'pending',
-      metadata: {
-        originalFileName: selectedFiles[0].name,
-        parameters: {
-          mode,
-          pages: mode === SPLIT_MODES.BY_PAGES ? parameters.pages : undefined,
-          hDiv: mode === SPLIT_MODES.BY_SECTIONS ? parameters.hDiv : undefined,
-          vDiv: mode === SPLIT_MODES.BY_SECTIONS ? parameters.vDiv : undefined,
-          merge: mode === SPLIT_MODES.BY_SECTIONS ? parameters.merge : undefined,
-          splitType: mode === SPLIT_MODES.BY_SIZE_OR_COUNT ? parameters.splitType : undefined,
-          splitValue: mode === SPLIT_MODES.BY_SIZE_OR_COUNT ? parameters.splitValue : undefined,
-          bookmarkLevel: mode === SPLIT_MODES.BY_CHAPTERS ? parameters.bookmarkLevel : undefined,
-          includeMetadata: mode === SPLIT_MODES.BY_CHAPTERS ? parameters.includeMetadata : undefined,
-          allowDuplicates: mode === SPLIT_MODES.BY_CHAPTERS ? parameters.allowDuplicates : undefined,
-        },
-        fileSize: selectedFiles[0].size
-      }
-    };
-
-    return { operation, operationId, fileId };
-  }, []);
-
-  const processResults = useCallback(async (blob: Blob) => {
-    try {
-      const zipFile = new File([blob], "split_result.zip", { type: "application/zip" });
-      const extractionResult = await zipFileService.extractPdfFiles(zipFile);
-
-      if (extractionResult.success && extractionResult.extractedFiles.length > 0) {
-        // Set local state for preview
-        setFiles(extractionResult.extractedFiles);
-        setThumbnails([]);
-        setIsGeneratingThumbnails(true);
-
-        // Add extracted files to FileContext for future use
-        await addFiles(extractionResult.extractedFiles);
-
-        const thumbnails = await Promise.all(
-          extractionResult.extractedFiles.map(async (file) => {
-            try {
-              return await generateThumbnailForFile(file);
-            } catch (error) {
-              console.warn(`Failed to generate thumbnail for ${file.name}:`, error);
-              return '';
-            }
-          })
-        );
-
-        setThumbnails(thumbnails);
-        setIsGeneratingThumbnails(false);
-      }
-    } catch (extractError) {
-      console.warn('Failed to extract files for preview:', extractError);
-    }
-  }, [addFiles]);
-
-  const executeOperation = useCallback(async (
-    mode: SplitMode | '',
-    parameters: SplitParameters,
-    selectedFiles: File[]
-  ) => {
-    if (selectedFiles.length === 0) {
-      setStatus(t("noFileSelected"));
-      return;
-    }
-
-    const { operation, operationId, fileId } = createOperation(mode, parameters, selectedFiles);
-    const { formData, endpoint } = buildFormData(mode, parameters, selectedFiles);
-
-    recordOperation(fileId, operation);
-
-    setStatus(t("loading"));
-    setIsLoading(true);
-    setErrorMessage(null);
-
-    try {
-      const response = await axios.post(endpoint, formData, { responseType: "blob" });
-      const blob = new Blob([response.data], { type: "application/zip" });
-      const url = window.URL.createObjectURL(blob);
-
-      setDownloadUrl(url);
-      setStatus(t("downloadComplete"));
-
-      await processResults(blob);
-      markOperationApplied(fileId, operationId);
-    } catch (error: any) {
-      console.error(error);
-      let errorMsg = t("error.pdfPassword", "An error occurred while splitting the PDF.");
-      if (error.response?.data && typeof error.response.data === 'string') {
-        errorMsg = error.response.data;
-      } else if (error.message) {
-        errorMsg = error.message;
-      }
-      setErrorMessage(errorMsg);
-      setStatus(t("error._value", "Split failed."));
-      markOperationFailed(fileId, operationId, errorMsg);
-    } finally {
-      setIsLoading(false);
-    }
-  }, [t, createOperation, buildFormData, recordOperation, markOperationApplied, markOperationFailed, processResults]);
-
-  const resetResults = useCallback(() => {
-    setFiles([]);
-    setThumbnails([]);
-    setIsGeneratingThumbnails(false);
-    setDownloadUrl(null);
-    setStatus('');
-    setErrorMessage(null);
-    setIsLoading(false);
-  }, []);
-
-  const clearError = useCallback(() => {
-    setErrorMessage(null);
-  }, []);
-
-  return {
-    executeOperation,
-
-    // Flattened result properties for cleaner access
-    files,
-    thumbnails,
-    isGeneratingThumbnails,
-    downloadUrl,
-    status,
-    errorMessage,
-    isLoading,
-
-    // Result management functions
-    resetResults,
-    clearError,
-  };
+
+  return useToolOperation<SplitParameters>({
+    operationType: 'split',
+    endpoint: (params) => getEndpoint(params),
+    buildFormData: buildFormData, // Clean multi-file signature: (params, selectedFiles) => FormData
+    filePrefix: 'split_',
+    responseHandler: {
+      type: 'zip',
+      useZipExtractor: true
+    },
+    validateParams: (params) => {
+      if (!params.mode) {
+        return { valid: false, errors: [t('split.validation.modeRequired', 'Split mode is required')] };
+      }
+      if (params.mode === SPLIT_MODES.BY_PAGES && !params.pages) {
+        return { valid: false, errors: [t('split.validation.pagesRequired', 'Page numbers are required for split by pages')] };
+      }
+      return { valid: true };
+    },
+    getErrorMessage: createStandardErrorHandler(t('split.error.failed', 'An error occurred while splitting the PDF.'))
+  });
 };
```
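After this refactor `mode` travels inside `SplitParameters`, so callers pass a single parameters object plus the selected files. A sketch of the new call site — the import paths and parameter values are illustrative:

```typescript
import { SPLIT_MODES } from '../../constants/splitConstants';              // path is illustrative
import { useSplitOperation } from '../../hooks/tools/split/useSplitOperation'; // path is illustrative

const SplitButton = ({ selectedFiles }: { selectedFiles: File[] }) => {
  const splitOperation = useSplitOperation();

  const handleSplit = () =>
    // 'mode' now lives on the parameters object instead of being a separate argument.
    splitOperation.executeOperation(
      {
        mode: SPLIT_MODES.BY_PAGES,
        pages: '1,3,5-9',        // values below are illustrative defaults
        hDiv: '2',
        vDiv: '2',
        merge: false,
        splitType: 'size',
        splitValue: '10',
        bookmarkLevel: '1',
        includeMetadata: false,
        allowDuplicates: false,
      },
      selectedFiles
    );

  return <button onClick={handleSplit} disabled={splitOperation.isLoading}>Split</button>;
};
```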
```diff
@@ -3,9 +3,7 @@ import { SPLIT_MODES, SPLIT_TYPES, ENDPOINTS, type SplitMode, type SplitType } f
 import { SplitParameters } from '../../../components/tools/split/SplitSettings';
 
 export interface SplitParametersHook {
-  mode: SplitMode | '';
   parameters: SplitParameters;
-  setMode: (mode: SplitMode | '') => void;
   updateParameter: (parameter: keyof SplitParameters, value: string | boolean) => void;
   resetParameters: () => void;
   validateParameters: () => boolean;
@@ -13,6 +11,7 @@ export interface SplitParametersHook {
 }
 
 const initialParameters: SplitParameters = {
+  mode: '',
   pages: '',
   hDiv: '2',
   vDiv: '2',
@@ -25,7 +24,6 @@ const initialParameters: SplitParameters = {
 };
 
 export const useSplitParameters = (): SplitParametersHook => {
-  const [mode, setMode] = useState<SplitMode | ''>('');
   const [parameters, setParameters] = useState<SplitParameters>(initialParameters);
 
   const updateParameter = (parameter: keyof SplitParameters, value: string | boolean) => {
@@ -34,13 +32,12 @@ export const useSplitParameters = (): SplitParametersHook => {
 
   const resetParameters = () => {
     setParameters(initialParameters);
-    setMode('');
   };
 
   const validateParameters = () => {
-    if (!mode) return false;
+    if (!parameters.mode) return false;
 
-    switch (mode) {
+    switch (parameters.mode) {
       case SPLIT_MODES.BY_PAGES:
         return parameters.pages.trim() !== "";
       case SPLIT_MODES.BY_SECTIONS:
@@ -55,14 +52,12 @@ export const useSplitParameters = (): SplitParametersHook => {
   };
 
   const getEndpointName = () => {
-    if (!mode) return ENDPOINTS[SPLIT_MODES.BY_PAGES];
-    return ENDPOINTS[mode as SplitMode];
+    if (!parameters.mode) return ENDPOINTS[SPLIT_MODES.BY_PAGES];
+    return ENDPOINTS[parameters.mode as SplitMode];
   };
 
   return {
-    mode,
     parameters,
-    setMode,
     updateParameter,
     resetParameters,
     validateParameters,
```
```diff
@@ -33,12 +33,11 @@ const Split = ({ onPreviewFile, onComplete, onError }: BaseToolProps) => {
   useEffect(() => {
     splitOperation.resetResults();
     onPreviewFile?.(null);
-  }, [splitParams.mode, splitParams.parameters, selectedFiles]);
+  }, [splitParams.parameters, selectedFiles]);
 
   const handleSplit = async () => {
     try {
       await splitOperation.executeOperation(
-        splitParams.mode,
         splitParams.parameters,
         selectedFiles
       );
@@ -105,14 +104,12 @@ const Split = ({ onPreviewFile, onComplete, onError }: BaseToolProps) => {
         >
           <Stack gap="sm">
             <SplitSettings
-              mode={splitParams.mode}
-              onModeChange={splitParams.setMode}
               parameters={splitParams.parameters}
               onParameterChange={splitParams.updateParameter}
               disabled={endpointLoading}
             />
 
-            {splitParams.mode && (
+            {splitParams.parameters.mode && (
               <OperationButton
                 onClick={handleSplit}
                 isLoading={splitOperation.isLoading}
```
frontend/src/utils/toolErrorHandler.ts (new file)
@@ -0,0 +1,33 @@
```typescript
/**
 * Standardized error handling utilities for tool operations
 */

/**
 * Default error extractor that follows the standard pattern
 */
export const extractErrorMessage = (error: any): string => {
  if (error.response?.data && typeof error.response.data === 'string') {
    return error.response.data;
  }
  if (error.message) {
    return error.message;
  }
  return 'Operation failed';
};

/**
 * Creates a standardized error handler for tool operations
 * @param fallbackMessage - Message to show when no specific error can be extracted
 * @returns Error handler function that follows the standard pattern
 */
export const createStandardErrorHandler = (fallbackMessage: string) => {
  return (error: any): string => {
    if (error.response?.data && typeof error.response.data === 'string') {
      return error.response.data;
    }
    if (error.message) {
      return error.message;
    }
    return fallbackMessage;
  };
};
```
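`createStandardErrorHandler` prefers a string response body, then the error's own message, then the supplied fallback. For example (the fallback text is arbitrary):

```typescript
import { createStandardErrorHandler } from '../utils/toolErrorHandler';

const getErrorMessage = createStandardErrorHandler('Operation failed. Please try again.');

getErrorMessage({ response: { data: 'PDF is password protected' } }); // -> 'PDF is password protected'
getErrorMessage(new Error('Network Error'));                          // -> 'Network Error'
getErrorMessage({});                                                  // -> 'Operation failed. Please try again.'
```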
frontend/src/utils/toolOperationTracker.ts (new file)
@@ -0,0 +1,28 @@
```typescript
import { FileOperation } from '../types/fileContext';

/**
 * Creates operation tracking data for FileContext integration
 */
export const createOperation = <TParams = void>(
  operationType: string,
  params: TParams,
  selectedFiles: File[]
): { operation: FileOperation; operationId: string; fileId: string } => {
  const operationId = `${operationType}-${Date.now()}-${Math.random().toString(36).substr(2, 9)}`;
  const fileId = selectedFiles.map(f => f.name).join(',');

  const operation: FileOperation = {
    id: operationId,
    type: operationType,
    timestamp: Date.now(),
    fileIds: selectedFiles.map(f => f.name),
    status: 'pending',
    metadata: {
      originalFileName: selectedFiles[0]?.name,
      parameters: params,
      fileSize: selectedFiles.reduce((sum, f) => sum + f.size, 0)
    }
  };

  return { operation, operationId, fileId };
};
```
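`createOperation` only builds the tracking record; recording it and marking it applied or failed stays with FileContext. An illustrative call (the operation type and params are placeholders):

```typescript
import { createOperation } from '../utils/toolOperationTracker';

// 'compress' and the params object are illustrative.
const trackCompress = (selectedFiles: File[]) => {
  const { operation, operationId, fileId } = createOperation('compress', { level: 5 }, selectedFiles);

  // FileContext then records it as 'pending' and later marks it applied or failed:
  //   recordOperation(fileId, operation);
  //   markOperationApplied(fileId, operationId) / markOperationFailed(fileId, operationId, message);
  return { operation, operationId, fileId };
};
```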
frontend/src/utils/toolResponseProcessor.ts (new file)
@@ -0,0 +1,45 @@
```typescript
import { zipFileService } from '../services/zipFileService';

export interface ResponseHandler {
  type: 'single' | 'zip' | 'custom';
  processor?: (blob: Blob) => Promise<File[]>;
  useZipExtractor?: boolean;
}

const defaultResponseHandler: ResponseHandler = {
  type: 'single'
};

/**
 * Processes API response blob based on handler configuration
 */
export const processResponse = async (
  blob: Blob,
  originalFiles: File[],
  filePrefix: string,
  responseHandler?: ResponseHandler
): Promise<File[]> => {
  const handler = responseHandler || defaultResponseHandler;

  switch (handler.type) {
    case 'zip':
      if (handler.useZipExtractor) {
        const zipFile = new File([blob], 'result.zip', { type: 'application/zip' });
        const extractionResult = await zipFileService.extractPdfFiles(zipFile);
        return extractionResult.success ? extractionResult.extractedFiles : [];
      }
      // Fall through to custom if no zip extractor
    case 'custom':
      if (handler.processor) {
        return await handler.processor(blob);
      }
      // Fall through to single
    case 'single':
    default:
      const contentType = blob.type || 'application/pdf';
      const filename = originalFiles.length === 1
        ? `${filePrefix}${originalFiles[0].name}`
        : `${filePrefix}result.pdf`;
      return [new File([blob], filename, { type: contentType })];
  }
};
```
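`processResponse` turns a response blob into `File` objects according to the handler type. Two illustrative calls, one with the default single-file handling and one with the ZIP extractor:

```typescript
import { processResponse } from '../utils/toolResponseProcessor';

const toFiles = async (blob: Blob, original: File) => {
  // Default 'single' handling: one prefixed file named after the original.
  const single = await processResponse(blob, [original], 'repaired_');

  // ZIP handling: extract the PDFs inside instead of keeping the archive.
  const extracted = await processResponse(blob, [original], 'split_', {
    type: 'zip',
    useZipExtractor: true,
  });

  return { single, extracted };
};
```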