
How Much Quality Degradation Occurs with Lossy Compression?

Published: July 12, 2025
Tags: python, png, jpeg, images

Introduction

Have you ever heard that "repeatedly saving a JPEG image degrades its quality"? But exactly how many iterations does it take, and how much quality is actually lost? This article explores the limits of image degradation by repeatedly recompressing a JPEG image and observing how it deteriorates.

System Specifications

MacBook Air (M2, arm64)

Background Knowledge

Lossless Compression

Compression is lossless when decompression restores the data to a bit-exact copy of the original. PNG is a typical lossless image format: no matter how many times you save and reload, the pixels never change.

Lossy Compression

Compression is lossy when some information is discarded during compression, so decompression cannot reproduce the original exactly. JPEG is the classic lossy image format: every save throws away detail that can never be recovered.
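A quick way to see the difference for yourself, assuming Pillow and numpy (this is an illustrative snippet, not part of the original experiment; file names are arbitrary):

import numpy as np
from PIL import Image

# Random noise is the worst case for JPEG, so the loss is obvious
rng = np.random.default_rng(0)
pixels = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
img = Image.fromarray(pixels)

img.save('roundtrip.png')                # lossless
img.save('roundtrip.jpg', quality=75)    # lossy

png_back = np.asarray(Image.open('roundtrip.png'))
jpg_back = np.asarray(Image.open('roundtrip.jpg'))

print(np.array_equal(pixels, png_back))  # True: bit-exact recovery
print(np.array_equal(pixels, jpg_back))  # False: information was discarded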

Experiment Overview

Compression Cycle Process:
Original Image → JPEG Compress → Decompress → JPEG Compress → Decompress → ...

Each cycle introduces cumulative quality loss

Experimental Setup

Python Environment and Libraries

import numpy as np
from PIL import Image, ImageDraw
import matplotlib.pyplot as plt
import os

# Create test image
def create_test_image():
    img = Image.new('RGB', (800, 600), 'white')
    draw = ImageDraw.Draw(img)
    
    # Add various elements to test compression
    draw.rectangle([100, 100, 300, 200], fill='red')
    draw.ellipse([400, 150, 600, 350], fill='blue')
    draw.line([0, 0, 800, 600], fill='black', width=3)
    
    return img
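Before running the degradation loop, it helps to keep a lossless copy of the test image as the reference point (the file name here is arbitrary):

original = create_test_image()
original.save('original.png')  # PNG keeps the reference bit-exact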

Compression Test Function

def compression_test(image, quality, iterations):
    """
    Repeatedly compress and decompress a JPEG image.

    Returns the final image and the file size recorded at each iteration.
    """
    current_image = image.copy()
    file_sizes = []
    
    for i in range(iterations):
        # Save as JPEG with the specified quality
        temp_path = f'temp_iteration_{i}.jpg'
        current_image.save(temp_path, 'JPEG', quality=quality)
        
        # Record the compressed file size
        file_sizes.append(os.path.getsize(temp_path))
        
        # Reload the compressed image; load() forces Pillow to read the
        # pixel data immediately, so the temp file can be safely deleted
        current_image = Image.open(temp_path)
        current_image.load()
        
        # Clean up the temporary file
        os.remove(temp_path)
        
        # Save a snapshot every 10 iterations; PNG is used so the snapshot
        # itself doesn't add an extra lossy compression step
        if i % 10 == 0:
            current_image.save(f'result_iteration_{i}.png')
    
    return current_image, file_sizes

Experimental Results

Quality Level Analysis

Testing with three different JPEG quality settings, each run driven as sketched below:

  - High quality (quality=95)
  - Medium quality (quality=75)
  - Low quality (quality=50)
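A minimal driver for these runs might look like the following. The original post doesn't show this code, so the variable names and the choice of 100 iterations are placeholders:

# Run the experiment at each quality level
original = create_test_image()
results = {}
for quality in (95, 75, 50):
    final_image, sizes = compression_test(original, quality=quality, iterations=100)
    results[quality] = (final_image, sizes)
    final_image.save(f'final_q{quality}.png')  # lossless snapshot of the end state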

File Size Behavior

def analyze_file_sizes(file_sizes):
    """Plot how the file size evolves across compression iterations."""
    plt.figure(figsize=(10, 6))
    plt.plot(range(len(file_sizes)), file_sizes, 'b-', linewidth=2)
    plt.title('File Size vs Compression Iterations')
    plt.xlabel('Iteration Number')
    plt.ylabel('File Size (bytes)')
    plt.grid(True)
    plt.show()

File sizes typically stabilize after the first few compressions: once a given quality setting has discarded all the detail it cannot represent, each subsequent encode finds little new information to remove, so the encoded size settles down.
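With the runs from the driver sketch above, plotting one quality level is a one-liner (results is the placeholder dict introduced earlier):

# Plot the medium-quality run
analyze_file_sizes(results[75][1])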

Key Observations

Degradation Patterns

  1. Initial Drop: Largest quality loss occurs in first few iterations
  2. Stabilization: Quality loss rate decreases over time
  3. Plateau Effect: Eventually reaches a degraded but stable state
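To put numbers on these patterns, one standard metric is PSNR (peak signal-to-noise ratio) against the pristine original. A minimal sketch assuming numpy; compute_psnr is not from the original post:

import numpy as np

def compute_psnr(original, degraded):
    """PSNR between two same-sized images, in dB (higher means closer)."""
    a = np.asarray(original, dtype=np.float64)
    b = np.asarray(degraded, dtype=np.float64)
    mse = np.mean((a - b) ** 2)
    if mse == 0:
        return float('inf')  # identical images
    return 10 * np.log10(255.0 ** 2 / mse)

# e.g. psnr = compute_psnr(create_test_image(), final_image)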

DCT Block Artifacts

JPEG encodes the image in 8x8-pixel DCT (Discrete Cosine Transform) blocks, and with each recompression the boundaries between those blocks become increasingly visible as a grid of artifacts.
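To see what the encoder actually works with, you can take the 2D DCT of a single 8x8 block. This sketch uses scipy.fft.dctn, an extra dependency the original post doesn't list:

import numpy as np
from scipy.fft import dctn

# An 8x8 grayscale block containing a smooth horizontal gradient
block = np.tile(np.linspace(0, 255, 8), (8, 1))

# 2D DCT-II, as used by JPEG (level-shifted by 128 first, as JPEG does)
coeffs = dctn(block - 128, norm='ortho')

# Energy concentrates in the top-left (low-frequency) corner
print(np.round(coeffs, 1))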

Frequency Domain Analysis

Higher-frequency details are lost first: JPEG's quantization step divides each DCT coefficient by a step size that grows with frequency and rounds the result, so small high-frequency coefficients collapse to zero while low-frequency ones survive.
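This is easy to reproduce on the coeffs array from the previous snippet. The quantization table below is a made-up illustration whose step sizes grow with frequency, not one of the actual JPEG tables:

# Step size grows with distance from the top-left (low-frequency) corner
i, j = np.indices((8, 8))
quant_table = 1 + 4 * (i + j)

quantized = np.round(coeffs / quant_table)
print(np.count_nonzero(quantized))  # far fewer nonzero coefficients survive

# Dequantizing shows what information the decoder actually gets back
restored = quantized * quant_table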

Practical Implications

Real-World Scenarios

Repeated recompression happens more often than you might expect: opening a JPEG photo in an editor and saving it again, or passing an image through services that re-encode uploads, each adds another compression cycle.

Best Practices

  1. Keep a lossless original (such as a PNG) and derive JPEGs from it, rather than editing JPEGs in place.
  2. When recompression is unavoidable, use a high quality setting (90 or above), which withstands many more cycles.
  3. Treat JPEG as a final delivery format, not a working format.

Conclusion

The experiment demonstrates that JPEG quality degradation follows a predictable pattern: rapid initial quality loss followed by stabilization. While high-quality settings (90%+) can withstand many recompression cycles, lower quality settings quickly become unusable.

Understanding these characteristics helps in making informed decisions about image storage and processing workflows. The key takeaway is to always preserve lossless originals and use JPEG compression judiciously in production workflows.