
Form File Uploads: Best Practices & Security

Pixelform Team · July 6, 2025

Key Takeaways

  • Secure file uploads require client-side and server-side validation with strict file type restrictions
  • Progressive upload with chunking handles large files reliably and provides better user experience
  • Virus scanning and content verification protect against malicious uploads targeting your infrastructure
  • Cloud storage integration with signed URLs provides scalable, secure file handling

File upload fields transform simple forms into powerful data collection tools: resume submissions, document attachments, image uploads, and supporting materials all depend on them.

However, file uploads also introduce security risks and technical complexity. This guide covers implementation best practices from validation to secure storage.

[Figure: File upload architecture showing validation, storage, and delivery]

File Upload Field Configuration

Configure upload fields with appropriate restrictions for your use case.

Basic Configuration

{
  "type": "file",
  "name": "resume",
  "label": "Upload Resume",
  "required": true,
  "config": {
    "maxFiles": 1,
    "maxSize": "10MB",
    "allowedTypes": [
      "application/pdf",
      "application/msword",
      "application/vnd.openxmlformats-officedocument.wordprocessingml.document"
    ],
    "allowedExtensions": [".pdf", ".doc", ".docx"]
  }
}

Multiple File Uploads

{
  "type": "file",
  "name": "supporting_documents",
  "label": "Supporting Documents",
  "config": {
    "multiple": true,
    "maxFiles": 5,
    "maxTotalSize": "50MB",
    "maxSize": "15MB",
    "allowedTypes": ["application/pdf", "image/*"]
  }
}

Image-Specific Configuration

{
  "type": "file",
  "name": "profile_photo",
  "label": "Profile Photo",
  "config": {
    "accept": "image/*",
    "maxSize": "5MB",
    "allowedTypes": ["image/jpeg", "image/png", "image/webp"],
    "imageValidation": {
      "minWidth": 200,
      "minHeight": 200,
      "maxWidth": 4000,
      "maxHeight": 4000,
      "aspectRatio": "1:1",
      "aspectRatioTolerance": 0.1
    },
    "preview": true,
    "crop": {
      "enabled": true,
      "aspectRatio": 1
    }
  }
}

Client-Side Validation

[Figure: Client-side file validation process]

Validate files before upload to provide instant feedback and reduce server load.

File Type Validation

function validateFileType(file, allowedTypes, allowedExtensions = ['.pdf', '.doc', '.docx', '.jpg', '.png']) {
  // Check the MIME type reported by the browser (may be empty for unknown types)
  if (!allowedTypes.includes(file.type)) {
    return {
      valid: false,
      error: `File type ${file.type || 'unknown'} is not allowed. Accepted types: ${allowedTypes.join(', ')}`
    };
  }

  // Also check the extension as a fallback
  const extension = '.' + file.name.split('.').pop().toLowerCase();

  if (!allowedExtensions.includes(extension)) {
    return {
      valid: false,
      error: `File extension ${extension} is not allowed`
    };
  }

  return { valid: true };
}

File Size Validation

function validateFileSize(file, maxSizeBytes) {
  if (file.size > maxSizeBytes) {
    const maxSizeMB = (maxSizeBytes / (1024 * 1024)).toFixed(1);
    const fileSizeMB = (file.size / (1024 * 1024)).toFixed(1);

    return {
      valid: false,
      error: `File size (${fileSizeMB}MB) exceeds the ${maxSizeMB}MB limit`
    };
  }

  return { valid: true };
}
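
The configuration examples above express limits as strings like "10MB", while validateFileSize expects bytes. A small parser (a hypothetical helper, not part of any library) bridges the two:

// Convert a human-readable size string ("10MB", "512KB") into bytes
function parseSize(size) {
  if (typeof size === 'number') return size; // already bytes
  const match = /^(\d+(?:\.\d+)?)\s*(B|KB|MB|GB)$/i.exec(String(size).trim());
  if (!match) throw new Error(`Unrecognized size: ${size}`);
  const units = { B: 1, KB: 1024, MB: 1024 ** 2, GB: 1024 ** 3 };
  return Math.round(parseFloat(match[1]) * units[match[2].toUpperCase()]);
}

// Usage: validateFileSize(file, parseSize(config.maxSize))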

Image Dimension Validation

async function validateImageDimensions(file, config) {
  return new Promise((resolve) => {
    const objectUrl = URL.createObjectURL(file);
    const img = new Image();

    img.onload = () => {
      URL.revokeObjectURL(objectUrl);

      const { width, height } = img;

      if (config.minWidth && width < config.minWidth) {
        resolve({
          valid: false,
          error: `Image width (${width}px) is less than minimum (${config.minWidth}px)`
        });
        return;
      }

      if (config.minHeight && height < config.minHeight) {
        resolve({
          valid: false,
          error: `Image height (${height}px) is less than minimum (${config.minHeight}px)`
        });
        return;
      }

      if (config.maxWidth && width > config.maxWidth) {
        resolve({
          valid: false,
          error: `Image width (${width}px) exceeds maximum (${config.maxWidth}px)`
        });
        return;
      }

      if (config.maxHeight && height > config.maxHeight) {
        resolve({
          valid: false,
          error: `Image height (${height}px) exceeds maximum (${config.maxHeight}px)`
        });
        return;
      }

      if (config.aspectRatio) {
        const [ratioW, ratioH] = config.aspectRatio.split(':').map(Number);
        const expectedRatio = ratioW / ratioH;
        const actualRatio = width / height;
        const tolerance = config.aspectRatioTolerance || 0.05;

        if (Math.abs(actualRatio - expectedRatio) > tolerance) {
          resolve({
            valid: false,
            error: `Image aspect ratio should be ${config.aspectRatio}`
          });
          return;
        }
      }

      resolve({ valid: true, dimensions: { width, height } });
    };

    img.onerror = () => {
      URL.revokeObjectURL(objectUrl); // release the object URL on failure too
      resolve({ valid: false, error: 'Failed to load image for validation' });
    };

    img.src = objectUrl;
  });
}

Complete Client Validation

// Format a byte count for error messages (config sizes are assumed to be in
// bytes; see the parseSize sketch above for converting "10MB"-style strings)
function formatBytes(bytes) {
  if (bytes >= 1024 ** 3) return (bytes / 1024 ** 3).toFixed(1) + 'GB';
  if (bytes >= 1024 ** 2) return (bytes / 1024 ** 2).toFixed(1) + 'MB';
  if (bytes >= 1024) return (bytes / 1024).toFixed(1) + 'KB';
  return bytes + 'B';
}

async function validateUpload(files, config) {
  const errors = [];
  const validFiles = [];

  // Check total file count
  if (files.length > config.maxFiles) {
    return {
      valid: false,
      errors: [`Maximum ${config.maxFiles} files allowed`]
    };
  }

  // Check total size
  const totalSize = files.reduce((sum, f) => sum + f.size, 0);
  if (config.maxTotalSize && totalSize > config.maxTotalSize) {
    return {
      valid: false,
      errors: [`Total upload size exceeds ${formatBytes(config.maxTotalSize)}`]
    };
  }

  // Validate each file
  for (const file of files) {
    const typeResult = validateFileType(file, config.allowedTypes, config.allowedExtensions);
    if (!typeResult.valid) {
      errors.push(`${file.name}: ${typeResult.error}`);
      continue;
    }

    const sizeResult = validateFileSize(file, config.maxSize);
    if (!sizeResult.valid) {
      errors.push(`${file.name}: ${sizeResult.error}`);
      continue;
    }

    if (file.type.startsWith('image/') && config.imageValidation) {
      const dimResult = await validateImageDimensions(file, config.imageValidation);
      if (!dimResult.valid) {
        errors.push(`${file.name}: ${dimResult.error}`);
        continue;
      }
    }

    validFiles.push(file);
  }

  return {
    valid: errors.length === 0,
    errors,
    validFiles
  };
}
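
To tie the pieces together, here is a minimal wiring sketch; the #fileInput and #errors element IDs and the uploadConfig object are assumptions, not part of the original markup:

const input = document.querySelector('#fileInput');
const errorList = document.querySelector('#errors');

input.addEventListener('change', async () => {
  const files = Array.from(input.files);
  const result = await validateUpload(files, uploadConfig);

  errorList.textContent = result.errors.join('\n');

  if (result.valid) {
    // Hand result.validFiles to your uploader (see ChunkedUploader below)
  }
});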

Server-Side Security

Never trust client-side validation alone. Implement comprehensive server-side checks.

[Figure: Server-side security validation flow]

File Type Verification

Check actual file content, not just extension or MIME type:

// CommonJS builds: file-type v16 and read-chunk v6 (later majors are ESM-only)
const fileType = require('file-type');
const readChunk = require('read-chunk');

async function verifyFileType(filePath, expectedTypes) {
  // Read the first bytes to detect the actual type from its magic numbers
  const buffer = await readChunk(filePath, 0, 4100);
  const detected = await fileType.fromBuffer(buffer);

  if (!detected) {
    // Fallback for text-based formats (HTML, SVG) with no magic bytes;
    // see the verifyTextFile sketch below
    return verifyTextFile(filePath, expectedTypes);
  }

  const allowed = expectedTypes.some(type => {
    if (type.endsWith('/*')) {
      return detected.mime.startsWith(type.replace('/*', '/'));
    }
    return detected.mime === type;
  });

  if (!allowed) {
    throw new Error(`File type ${detected.mime} is not allowed`);
  }

  return detected;
}
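
The verifyTextFile fallback is referenced above but not shown; here is a minimal sketch, assuming SVG is the only text-based format in your allowlist (adjust the signature checks to match yours):

const fsPromises = require('fs').promises;

async function verifyTextFile(filePath, expectedTypes) {
  const head = (await fsPromises.readFile(filePath, 'utf8')).slice(0, 1000);

  // Rough content check: SVG documents contain an <svg> root element
  if (expectedTypes.includes('image/svg+xml') && /<svg[\s>]/i.test(head)) {
    return { mime: 'image/svg+xml', ext: 'svg' }; // mirrors file-type's shape
  }

  throw new Error('Unable to verify file type from content');
}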

Virus Scanning

Scan uploads for malware before storage:

const NodeClam = require('clamscan');
const fs = require('fs').promises;

async function scanForViruses(filePath) {
  const clamscan = await new NodeClam().init({
    clamdscan: {
      host: 'localhost',
      port: 3310
    }
  });

  const { isInfected, viruses } = await clamscan.scanFile(filePath);

  if (isInfected) {
    // Delete infected file immediately
    await fs.unlink(filePath);

    throw new Error(`File is infected with: ${viruses.join(', ')}`);
  }

  return { clean: true };
}

Content Sanitization

Sanitize files that could contain executable content:

const DOMPurify = require('dompurify');
const { JSDOM } = require('jsdom');
const fs = require('fs').promises;

async function sanitizeSVG(filePath) {
  const content = await fs.readFile(filePath, 'utf8');

  const window = new JSDOM('').window;
  const purify = DOMPurify(window);

  const clean = purify.sanitize(content, {
    USE_PROFILES: { svg: true },
    ADD_TAGS: ['use'],
    FORBID_TAGS: ['script', 'foreignObject'],
    FORBID_ATTR: ['onclick', 'onerror', 'onload']
  });

  await fs.writeFile(filePath, clean);
  return clean;
}

async function sanitizeImage(filePath) {
  // Re-encode images to strip metadata and potential exploit payloads.
  // Note: this converts everything to JPEG and leaves the original file
  // in place, so delete it once the re-encoded copy is written.
  const sharp = require('sharp');

  const buffer = await sharp(filePath)
    .rotate() // apply EXIF rotation before the metadata is stripped
    .toBuffer();

  const outputPath = filePath.replace(/\.[^.]+$/, '.jpg');
  await sharp(buffer)
    .jpeg({ quality: 90 })
    .toFile(outputPath);

  return outputPath;
}

Filename Sanitization

const crypto = require('crypto');

function sanitizeFilename(filename) {
  // Remove path traversal attempts
  let safe = filename.replace(/[/\\]/g, '');

  // Remove null bytes
  safe = safe.replace(/\0/g, '');

  // Replace problematic characters
  safe = safe.replace(/[<>:"|?*]/g, '_');

  // Limit length, preserving the extension (if any)
  const dotIndex = safe.lastIndexOf('.');
  const ext = dotIndex > 0 ? safe.slice(dotIndex) : '';
  const name = dotIndex > 0 ? safe.slice(0, dotIndex) : safe;
  safe = name.slice(0, 200) + ext;

  // Generate unique name
  const uniqueId = crypto.randomBytes(8).toString('hex');
  safe = `${uniqueId}-${safe}`;

  return safe;
}
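
For example (the random hex prefix differs on every call):

sanitizeFilename('../../etc/passwd'); // => "<hex>-....etcpasswd" (traversal neutralized)
sanitizeFilename('report?*.pdf');     // => "<hex>-report__.pdf"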

Progressive Upload Implementation

Handle large files with chunked uploads and progress feedback.

Chunked Upload Client

class ChunkedUploader {
  constructor(file, options = {}) {
    this.file = file;
    this.chunkSize = options.chunkSize || 5 * 1024 * 1024; // 5MB chunks
    this.uploadUrl = options.uploadUrl;
    this.onProgress = options.onProgress || (() => {});
    this.chunks = Math.ceil(file.size / this.chunkSize);
    this.uploadedChunks = 0;
  }

  async upload() {
    // Initialize upload session
    const { uploadId } = await this.initUpload();

    // Upload chunks
    for (let i = 0; i < this.chunks; i++) {
      await this.uploadChunk(uploadId, i);
      this.uploadedChunks++;
      this.onProgress({
        chunk: i + 1,
        totalChunks: this.chunks,
        percentage: Math.round((this.uploadedChunks / this.chunks) * 100)
      });
    }

    // Complete upload
    return await this.completeUpload(uploadId);
  }

  async initUpload() {
    const response = await fetch(`${this.uploadUrl}/init`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        filename: this.file.name,
        size: this.file.size,
        type: this.file.type,
        chunks: this.chunks
      })
    });

    return response.json();
  }

  async uploadChunk(uploadId, chunkIndex) {
    const start = chunkIndex * this.chunkSize;
    const end = Math.min(start + this.chunkSize, this.file.size);
    const chunk = this.file.slice(start, end);

    const formData = new FormData();
    formData.append('chunk', chunk);
    formData.append('index', chunkIndex);

    const response = await fetch(`${this.uploadUrl}/${uploadId}/chunk`, {
      method: 'POST',
      body: formData
    });

    if (!response.ok) {
      throw new Error(`Chunk ${chunkIndex} upload failed`);
    }
  }

  async completeUpload(uploadId) {
    const response = await fetch(`${this.uploadUrl}/${uploadId}/complete`, {
      method: 'POST'
    });

    return response.json();
  }
}

// Usage
const uploader = new ChunkedUploader(file, {
  uploadUrl: '/api/upload',
  onProgress: (progress) => {
    progressBar.style.width = `${progress.percentage}%`;
    progressText.textContent = `Uploading: ${progress.percentage}%`;
  }
});

const result = await uploader.upload();
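
As written, upload() aborts on the first failed chunk. A retry wrapper makes it resilient to transient network errors; this is a sketch, and the retry count and backoff delays are illustrative:

// Retry a single chunk with exponential backoff before giving up
async function uploadChunkWithRetry(uploader, uploadId, chunkIndex, retries = 3) {
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await uploader.uploadChunk(uploadId, chunkIndex);
    } catch (err) {
      if (attempt === retries) throw err;
      // Back off 1s, 2s, 4s, ... between attempts
      await new Promise(resolve => setTimeout(resolve, 1000 * 2 ** attempt));
    }
  }
}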

Upload Progress UI

<div class="upload-container">
  <div class="dropzone" id="dropzone">
    <input type="file" id="fileInput" hidden>
    <p>Drop files here or click to select</p>
  </div>

  <div class="upload-progress" id="progress" hidden>
    <div class="file-info">
      <span class="filename"></span>
      <span class="filesize"></span>
    </div>
    <div class="progress-bar">
      <div class="progress-fill"></div>
    </div>
    <span class="progress-text">0%</span>
    <button class="cancel-btn">Cancel</button>
  </div>

  <div class="upload-complete" id="complete" hidden>
    <span class="success-icon">Check</span>
    <span>Upload complete</span>
  </div>
</div>
The accompanying styles:

.dropzone {
  border: 2px dashed #d1d5db;
  border-radius: 12px;
  padding: 48px;
  text-align: center;
  cursor: pointer;
  transition: all 0.2s;
}

.dropzone:hover,
.dropzone.dragover {
  border-color: #6366f1;
  background: #f5f3ff;
}

.progress-bar {
  width: 100%;
  height: 8px;
  background: #e5e7eb;
  border-radius: 4px;
  overflow: hidden;
}

.progress-fill {
  height: 100%;
  background: linear-gradient(90deg, #6366f1, #8b5cf6);
  transition: width 0.3s ease;
}
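
The markup and styles need a little JavaScript to come alive. A minimal sketch, using the IDs and class names from the markup above; handleFiles is a placeholder for your validation-plus-upload entry point:

const dropzone = document.getElementById('dropzone');
const fileInput = document.getElementById('fileInput');

dropzone.addEventListener('click', () => fileInput.click());

dropzone.addEventListener('dragover', (e) => {
  e.preventDefault(); // required so the browser allows dropping
  dropzone.classList.add('dragover');
});

dropzone.addEventListener('dragleave', () => dropzone.classList.remove('dragover'));

dropzone.addEventListener('drop', (e) => {
  e.preventDefault();
  dropzone.classList.remove('dragover');
  handleFiles(e.dataTransfer.files); // placeholder: validate, then upload
});

fileInput.addEventListener('change', () => handleFiles(fileInput.files));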

Cloud Storage Integration

[Figure: Cloud storage integration flow]

AWS S3 Direct Upload

Upload directly to S3 using presigned URLs:

// Server: Generate presigned URL
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

async function getPresignedUploadUrl(filename, contentType) {
  const key = `uploads/${Date.now()}-${sanitizeFilename(filename)}`;

  const params = {
    Bucket: process.env.S3_BUCKET,
    Key: key,
    ContentType: contentType,
    Expires: 300 // 5 minutes
  };

  const url = await s3.getSignedUrlPromise('putObject', params);

  return { url, key };
}

// Client: Upload to presigned URL
async function uploadToS3(file) {
  // Get presigned URL from your server
  const { url, key } = await fetch('/api/upload/presign', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ filename: file.name, type: file.type })
  }).then(r => r.json());

  // Upload directly to S3
  await fetch(url, {
    method: 'PUT',
    body: file,
    headers: { 'Content-Type': file.type }
  });

  return key;
}

Google Cloud Storage

const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
const bucket = storage.bucket(process.env.GCS_BUCKET);

async function getSignedUploadUrl(filename, contentType) {
  const file = bucket.file(`uploads/${Date.now()}-${sanitizeFilename(filename)}`);

  const [url] = await file.getSignedUrl({
    version: 'v4',
    action: 'write',
    expires: Date.now() + 5 * 60 * 1000,
    contentType
  });

  return { url, path: file.name };
}

Cloudflare R2

const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');
const { getSignedUrl } = require('@aws-sdk/s3-request-presigner');

const r2 = new S3Client({
  region: 'auto',
  endpoint: `https://${process.env.CF_ACCOUNT_ID}.r2.cloudflarestorage.com`,
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY,
    secretAccessKey: process.env.R2_SECRET_KEY
  }
});

async function getR2PresignedUrl(filename, contentType) {
  const key = `uploads/${Date.now()}-${sanitizeFilename(filename)}`;

  const command = new PutObjectCommand({
    Bucket: process.env.R2_BUCKET,
    Key: key,
    ContentType: contentType
  });

  const url = await getSignedUrl(r2, command, { expiresIn: 300 });

  return { url, key };
}
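
Whichever provider you choose, the presign endpoint is the last place your server sees the request, so it must enforce the same restrictions as the form itself. A minimal Express sketch; the route path and allowlist are illustrative:

const express = require('express');
const app = express();
app.use(express.json());

// Mirror your form field's allowedTypes here
const ALLOWED_TYPES = ['application/pdf', 'image/jpeg', 'image/png'];

app.post('/api/upload/presign', async (req, res) => {
  const { filename, type } = req.body;

  // Enforce the allowlist server-side; never trust the client's claim alone
  if (!ALLOWED_TYPES.includes(type)) {
    return res.status(400).json({ error: 'File type not allowed' });
  }

  // getPresignedUploadUrl (defined above) sanitizes the filename and signs
  // the URL with the declared Content-Type
  const { url, key } = await getPresignedUploadUrl(filename, type);
  res.json({ url, key });
});

Because the URL is signed with the declared Content-Type, S3 rejects uploads that send a different one.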

Image Processing

Optimize uploaded images for web delivery.

const sharp = require('sharp');

async function processImage(inputPath, options = {}) {
  const {
    maxWidth = 2000,
    maxHeight = 2000,
    quality = 85,
    format = 'webp'
  } = options;

  const image = sharp(inputPath);
  const metadata = await image.metadata();

  let pipeline = image;

  // Resize if needed
  if (metadata.width > maxWidth || metadata.height > maxHeight) {
    pipeline = pipeline.resize(maxWidth, maxHeight, {
      fit: 'inside',
      withoutEnlargement: true
    });
  }

  // Apply format-specific optimizations
  switch (format) {
    case 'webp':
      pipeline = pipeline.webp({ quality });
      break;
    case 'jpeg':
      pipeline = pipeline.jpeg({ quality, mozjpeg: true });
      break;
    case 'png':
      pipeline = pipeline.png({ compressionLevel: 9 });
      break;
    case 'avif':
      pipeline = pipeline.avif({ quality });
      break;
  }

  // Write the processed primary image so the resize and format options
  // above actually take effect
  const mainPath = inputPath.replace(/\.[^.]+$/, `.${format}`);
  await pipeline.clone().toFile(mainPath);

  // Generate multiple sizes for responsive images; sharp applies the most
  // recent resize() call, so each clone gets its own target width
  const sizes = [320, 640, 1024, 1920];
  const variants = {};

  for (const width of sizes) {
    if (width < metadata.width) {
      const outputPath = inputPath.replace(/\.[^.]+$/, `-${width}.${format}`);
      await pipeline.clone()
        .resize(width)
        .toFile(outputPath);
      variants[width] = outputPath;
    }
  }

  return { original: mainPath, variants };
}
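
For example, a single call can produce a WebP primary image plus responsive variants (paths are illustrative):

const { original, variants } = await processImage('/uploads/photo.jpg', {
  maxWidth: 1920,
  quality: 80,
  format: 'webp'
});
// original => '/uploads/photo.webp'
// variants => { 320: '/uploads/photo-320.webp', 640: '/uploads/photo-640.webp', ... }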

Thumbnail Generation

async function generateThumbnail(inputPath, size = 200) {
  const outputPath = inputPath.replace(/\.[^.]+$/, '-thumb.webp');

  await sharp(inputPath)
    .resize(size, size, {
      fit: 'cover',
      position: 'centre'
    })
    .webp({ quality: 80 })
    .toFile(outputPath);

  return outputPath;
}

FAQ

What file types should I allow for resume uploads?

For resume uploads, allow PDF, Microsoft Word (.doc, .docx), and optionally plain text (.txt) and rich text (.rtf) formats. PDF is recommended as the primary format because it preserves formatting across all devices. Limit file size to 5-10MB. Always verify the actual file content server-side, not just the extension.

How do I handle file uploads that fail midway?

Implement resumable uploads using chunked upload protocols. Store upload progress server-side with a unique upload ID. When a chunk fails, retry that specific chunk rather than the entire file. Allow users to resume incomplete uploads within a reasonable time window (24-48 hours). Automatically clean up abandoned partial uploads.

What is the maximum file size I should allow?

Maximum file size depends on your use case and infrastructure. For document forms, 10-25MB covers most needs. For image uploads, 5-10MB works well with proper compression. For video or large media, consider 100MB-1GB with chunked uploads. Set conservative limits initially and increase based on actual user needs and server capacity.

How do I protect against malicious file uploads?

Implement multiple layers of protection: validate file type by content (not just extension), scan for viruses using ClamAV or cloud scanning services, sanitize files that could contain scripts (SVGs, Office documents), store uploads outside the web root, generate new filenames to prevent overwrites, and serve files through a CDN with proper content-type headers.

Should I store uploaded files locally or in cloud storage?

Cloud storage (S3, GCS, R2) is recommended for production applications. Benefits include unlimited scalability, built-in redundancy, CDN integration, and reduced server load. Use presigned URLs for direct uploads to bypass your server entirely. Local storage works for development and small-scale deployments but requires manual backup and scaling solutions.

Secure File Uploads Made Simple

Pixelform handles file upload security, storage, and processing automatically. Upload files to any form with validation, virus scanning, and cloud storage included.

Add file uploads to your forms with enterprise-grade security built in.
