GoF Creational Pattern

Prototype Pattern

Specify the kinds of objects to create using a prototypical instance, and create new objects by copying this prototype — GoF, 1994

29 Q&As · 6 Bug Studies · 10 Pitfalls · 4 Testing Strategies · C# / .NET
Section 1

TL;DR

The .NET Prototype story is messy. ICloneable (System.ICloneable, an interface with a single Clone() method that returns object) has been in the BCL since .NET 1.0, but Microsoft themselves say don't use it: it has been officially discouraged since 2005 (Framework Design Guidelines, 1st edition) because it doesn't specify whether the clone is deep or shallow, and the guidelines state flatly, "Do not implement ICloneable." MemberwiseClone(), a protected method on System.Object, only does shallow copies: value-type fields are copied by value, reference-type fields by reference, so the original and the clone share the same nested objects. It's a native CLR call, so it's extremely fast (~2-5 ns), but shallow is all you get. C# 9 records give you with expressions (var copy = original with { Name = "New" } produces a shallow memberwise clone with one property overridden), but that's still shallow. Deep cloning? You're on your own: serialization, manual copy constructors, or a library like Mapperly (a source generator that, unlike reflection-based AutoMapper, emits mapping code at compile time: zero reflection, AOT-compatible). The Prototype pattern itself is elegant; the .NET implementation story is a minefield.

What: Instead of calling new and configuring an object from scratch, you clone an existing prototypical instance: a pre-configured "template" object that serves as the master copy and encapsulates the expensive or complex setup you don't want to repeat. You never use the prototype directly; the client asks it to copy itself, getting a new object without knowing the concrete class. Clone, tweak, use.

When: Use when object creation is expensive (database lookups, file parsing, network calls to populate state), when you need many similar objects with small variations, or when you want to avoid a parallel hierarchy of factories for every product type.

Modern .NET: Use records with with expressions (C# 9+: var clone = original with { Price = 9.99m }; creates a shallow memberwise copy with the specified properties overridden; under the hood the compiler generates a protected copy constructor and a <Clone>$ method) for simple value-like objects. Use serialization-based deep clone (serialize to bytes via System.Text.Json, MessagePack, or protobuf-net, then deserialize into a new instance; slower than MemberwiseClone but guarantees full independence of the entire object graph, at a typical cost of 500 ns to 50 μs depending on object size and serializer) for complex graphs. Use manual copy constructors (a constructor that takes an instance of its own type and copies fields, e.g. public Document(Document source) { Title = source.Title; Sections = source.Sections.Select(s => new Section(s)).ToList(); }) when you need precise control over what gets deep-copied and what stays shared.

Quick Code:

Prototype-at-a-glance.cs
// The Prototype interface
public interface IPrototype<T>
{
    T DeepClone();
}

// A concrete prototype — a Document you can clone
public class Document : IPrototype<Document>
{
    public string Title { get; set; } = "";
    public List<string> Sections { get; set; } = new();

    public Document DeepClone()
    {
        return new Document
        {
            Title = this.Title,
            Sections = new List<string>(this.Sections)  // deep copy the list
        };
    }
}

// Usage: clone instead of constructing from scratch
var template = new Document { Title = "Quarterly Report", Sections = { "Intro", "Financials", "Summary" } };

var q1Report = template.DeepClone();  // independent copy
q1Report.Title = "Q1 2025 Report";
q1Report.Sections.Add("Q1 Appendix");  // doesn't affect template

var q2Report = template.DeepClone();  // another independent copy
q2Report.Title = "Q2 2025 Report";
Section 2

Prerequisites

Before reading this page, make sure you understand:
Reference vs Value Types — Prototype is ALL about copying. If you don't understand why var b = a; copies the reference (not the object) for classes, and copies the value for structs, every clone bug will blindside you. This is the single most important prerequisite.

Object Creation Basics — Constructors, the new keyword, and object initializers (new Document { Title = "X", Author = "Y" }: the compiler calls the parameterless constructor, then sets each property; syntactic sugar that creates a NEW object, not a copy). You need to know what Prototype is replacing; if you've never felt the pain of constructing complex objects by hand, the pattern won't click.

Interfaces — The Prototype pattern uses an IPrototype<T> interface (or ICloneable) to decouple the client from concrete types. You should be comfortable defining and implementing interfaces in C#.

Serialization Concepts — Deep cloning often uses serialization as a shortcut: converting an object graph into a byte stream and reconstructing it back, with serializers like System.Text.Json, Newtonsoft.Json, MessagePack, or protobuf-net. You don't need to be an expert, but knowing what JSON/binary serialization does will help you understand the trade-offs.
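The first prerequisite is the one that bites hardest, so here is a minimal sketch of assignment semantics (BoxClass and BoxStruct are hypothetical types, not from this page):

```csharp
var a = new BoxClass { Value = 1 };
var b = a;                       // copies the REFERENCE: a and b point at the same object
b.Value = 99;
Console.WriteLine(a.Value);      // 99 — mutating "the copy" mutated the original

var x = new BoxStruct { Value = 1 };
var y = x;                       // copies the VALUE: y is an independent struct
y.Value = 99;
Console.WriteLine(x.Value);      // 1 — the original is untouched

public class BoxClass { public int Value; }   // reference type
public struct BoxStruct { public int Value; } // value type
```

Every shallow-vs-deep bug later on this page is this snippet in disguise.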
Section 3

Analogies

The Photocopier (Primary)

You write a perfect one-page memo — formatting, headers, logos, signatures, all dialed in. Now you need 50 copies. Do you sit down and hand-write each one from scratch? Of course not. You walk to the photocopier, press a button, and get an identical copy. Then you scribble "ATTN: Engineering" on one copy and "ATTN: Marketing" on another. Each copy started identical but can diverge independently. That's Prototype: start from a pre-built original, clone it, customize the clone.

Real World → Code Concept
Original memo → Prototype instance (the template object)
Photocopier → Clone() / DeepClone() method
Each photocopy → New cloned object (independent instance)
Scribbling notes on a copy → Modifying properties on the clone
Original stays pristine → Prototype instance is unchanged after cloning
Cell Division (Mitosis)

A cell copies its DNA and splits into two identical daughter cells. Each cell is a fully independent organism that can then mutate on its own path. That's deep cloning — the "DNA" (nested data) is duplicated, not shared.

"Save As" on a Document

Like hitting "Save As" on a fully-formatted document — you start with a complete file and create an independent copy you can edit without affecting the original. The key insight: you're not building from scratch (that's Factory), you're duplicating something that already exists.

Carbon Copy Paper

Write on the top sheet, and the carbon paper creates a duplicate on the sheet below. But here's the shallow copy trap: if you erase something on the original, the carbon copy still has it. They're independent — but only if the copy was truly separate from the start.

Section 4

Core Concept Diagram

The Prototype pattern has one of the simplest structures in the GoF catalog. A Prototype interface declares Clone(), and concrete prototypes implement it; the client calls prototype.Clone() instead of new ConcretePrototype(). (In the GoF book the Prototype is an abstract class; in modern C# it's typically an interface like IPrototype<T>, or sometimes just a convention. Either way, it guarantees every concrete prototype can copy itself without the client knowing the concrete type.)

Prototype-UML
[UML class diagram: the Client holds a dependency on the «interface» Prototype, which declares + Clone() : Prototype. ConcretePrototype1 and ConcretePrototype2 implement the interface; each one's Clone() returns a copy of self.]
Key Insight: The client never calls new ConcretePrototype1(). It only calls prototype.Clone(). This means you can add ConcretePrototype3, ConcretePrototype4, etc. without touching the client — as long as they implement the Prototype interface. The client is decoupled from the concrete class hierarchy.
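A minimal sketch of that decoupling, using the IPrototype<T> interface convention from this page; Spawner is a hypothetical client:

```csharp
using System.Collections.Generic;

public interface IPrototype<out T> where T : class
{
    T DeepClone();
}

// The client: generic over the interface, never names a concrete prototype.
// Adding ConcretePrototype3 tomorrow requires zero changes here.
public static class Spawner
{
    public static List<T> SpawnMany<T>(IPrototype<T> prototype, int count) where T : class
    {
        var spawned = new List<T>();
        for (var i = 0; i < count; i++)
            spawned.Add(prototype.DeepClone());  // no 'new ConcreteX()' anywhere
        return spawned;
    }
}
```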
Section 5

Code Implementations

The classic shallow-vs-deep scenario. A Document has nested Section objects — shallow copy shares them, deep copy duplicates them.

DocumentPrototype.cs
public class Section
{
    public string Heading { get; set; } = "";
    public string Content { get; set; } = "";

    public Section DeepClone() => new()
    {
        Heading = this.Heading,
        Content = this.Content
    };
}

public class Document : IPrototype<Document>
{
    public string Title { get; set; } = "";
    public string Author { get; set; } = "";
    public DateTime Created { get; set; } = DateTime.UtcNow;
    public List<Section> Sections { get; set; } = new();
    public Dictionary<string, string> Metadata { get; set; } = new();

    // ❌ SHALLOW clone — Sections list is shared!
    public Document ShallowClone()
    {
        return (Document)this.MemberwiseClone();
        // Sections and Metadata are reference types →
        // clone.Sections IS the same list as original.Sections
    }

    // ✅ DEEP clone — everything is independent
    public Document DeepClone()
    {
        return new Document
        {
            Title = this.Title,
            Author = this.Author,
            Created = DateTime.UtcNow,  // new timestamp for the clone
            Sections = this.Sections.Select(s => s.DeepClone()).ToList(),
            Metadata = new Dictionary<string, string>(this.Metadata)
        };
    }
}

// Usage
var template = new Document
{
    Title = "Quarterly Report Template",
    Author = "Finance Team",
    Sections = { new Section { Heading = "Executive Summary" }, new Section { Heading = "Revenue" } },
    Metadata = { ["department"] = "finance", ["version"] = "2.0" }
};

var q1 = template.DeepClone();
q1.Title = "Q1 2025 Report";
q1.Sections.Add(new Section { Heading = "Q1 Appendix" });
// template.Sections.Count is still 2 — deep clone works!

Games spawn hundreds of enemies per level. Each enemy type has base stats, sprite references, and AI configurations. Constructing from scratch every time is wasteful — clone a pre-configured template instead. In game development these templates are often called "archetypes" or "prefabs" (Unity's term); the Prototype pattern formalizes the idea: store one fully-configured Goblin, one Orc, one Dragon, and clone them on demand instead of re-reading config files or re-loading assets for each spawn.

GameEntityPrototype.cs
public class Stats
{
    public int Health { get; set; }
    public int Attack { get; set; }
    public int Defense { get; set; }
    public double Speed { get; set; }

    public Stats DeepClone() => new()
    {
        Health = this.Health, Attack = this.Attack,
        Defense = this.Defense, Speed = this.Speed
    };
}

public class AiConfig
{
    public string Behavior { get; set; } = "patrol";
    public double AggroRange { get; set; }
    public List<string> Abilities { get; set; } = new();

    public AiConfig DeepClone() => new()
    {
        Behavior = this.Behavior,
        AggroRange = this.AggroRange,
        Abilities = new List<string>(this.Abilities)
    };
}

public class GameEntity : IPrototype<GameEntity>
{
    public string Name { get; set; } = "";
    public string SpriteId { get; set; } = "";  // shared asset reference — OK to shallow copy
    public Stats Stats { get; set; } = new();
    public AiConfig Ai { get; set; } = new();
    public Guid InstanceId { get; private set; } = Guid.NewGuid();

    public GameEntity DeepClone()
    {
        return new GameEntity
        {
            Name = this.Name,
            SpriteId = this.SpriteId,          // string is immutable — safe to share
            Stats = this.Stats.DeepClone(),     // deep copy mutable stats
            Ai = this.Ai.DeepClone(),           // deep copy mutable AI config
            InstanceId = Guid.NewGuid()         // NEW unique ID for this clone
        };
    }
}

// Prototype Registry — pre-loaded enemy templates
public static class EnemyRegistry
{
    private static readonly Dictionary<string, GameEntity> _prototypes = new()
    {
        ["goblin"] = new GameEntity
        {
            Name = "Goblin", SpriteId = "sprites/goblin",
            Stats = new Stats { Health = 30, Attack = 8, Defense = 3, Speed = 1.2 },
            Ai = new AiConfig { Behavior = "patrol", AggroRange = 5, Abilities = { "slash" } }
        },
        ["orc"] = new GameEntity
        {
            Name = "Orc", SpriteId = "sprites/orc",
            Stats = new Stats { Health = 80, Attack = 15, Defense = 10, Speed = 0.8 },
            Ai = new AiConfig { Behavior = "guard", AggroRange = 8, Abilities = { "slam", "roar" } }
        }
    };

    public static GameEntity Spawn(string type)
    {
        if (!_prototypes.TryGetValue(type, out var proto))
            throw new ArgumentException($"Unknown enemy type: {type}");
        return proto.DeepClone();
    }
}

// Spawn 100 goblins — each with independent stats
var horde = Enumerable.Range(0, 100)
    .Select(_ => EnemyRegistry.Spawn("goblin"))
    .ToList();

horde[0].Stats.Health = 1;  // this goblin is almost dead
// horde[1].Stats.Health is still 30 — deep clone ensures independence

Applications often need multiple configurations that share 90% of their settings. Instead of building each from scratch, clone a base configuration and override the differences.

ConfigPrototype.cs
public class DatabaseConfig
{
    public string Host { get; set; } = "localhost";
    public int Port { get; set; } = 5432;
    public string Database { get; set; } = "";
    public int MaxPoolSize { get; set; } = 20;
    public TimeSpan CommandTimeout { get; set; } = TimeSpan.FromSeconds(30);
    public Dictionary<string, string> ExtraParams { get; set; } = new();

    public DatabaseConfig DeepClone() => new()
    {
        Host = this.Host, Port = this.Port, Database = this.Database,
        MaxPoolSize = this.MaxPoolSize, CommandTimeout = this.CommandTimeout,
        ExtraParams = new Dictionary<string, string>(this.ExtraParams)
    };
}

public class AppConfig : IPrototype<AppConfig>
{
    public string Environment { get; set; } = "development";
    public DatabaseConfig PrimaryDb { get; set; } = new();
    public DatabaseConfig? ReadReplicaDb { get; set; }
    public int MaxRetries { get; set; } = 3;
    public bool EnableCaching { get; set; } = true;
    public List<string> AllowedOrigins { get; set; } = new();
    public Dictionary<string, bool> FeatureFlags { get; set; } = new();

    public AppConfig DeepClone() => new()
    {
        Environment = this.Environment,
        PrimaryDb = this.PrimaryDb.DeepClone(),
        ReadReplicaDb = this.ReadReplicaDb?.DeepClone(),
        MaxRetries = this.MaxRetries,
        EnableCaching = this.EnableCaching,
        AllowedOrigins = new List<string>(this.AllowedOrigins),
        FeatureFlags = new Dictionary<string, bool>(this.FeatureFlags)
    };
}

// Base template — shared by all environments
var baseConfig = new AppConfig
{
    PrimaryDb = new DatabaseConfig { Host = "db.internal", Database = "myapp", MaxPoolSize = 50 },
    AllowedOrigins = { "https://myapp.com" },
    FeatureFlags = { ["dark-mode"] = true, ["beta-search"] = false }
};

// Staging: clone base, tweak environment-specific settings
var staging = baseConfig.DeepClone();
staging.Environment = "staging";
staging.PrimaryDb.Host = "staging-db.internal";
staging.PrimaryDb.MaxPoolSize = 10;
staging.FeatureFlags["beta-search"] = true;  // test beta in staging

// Production: clone base, different overrides
var production = baseConfig.DeepClone();
production.Environment = "production";
production.PrimaryDb.Host = "prod-primary.internal";
production.ReadReplicaDb = new DatabaseConfig { Host = "prod-replica.internal", Database = "myapp" };
production.AllowedOrigins.Add("https://api.myapp.com");

// baseConfig is untouched — each environment is fully independent
Section 6

Junior vs Senior Implementation

Problem Statement

Build a report generator that creates monthly reports for different departments. Each report shares 80% of its structure (header, footer, formatting, compliance sections) but differs in department-specific content. The base report takes 200ms to construct (template loading, formatting rules, compliance checks).

How a Junior Thinks

"I'll just copy each property manually from the old report to the new one. Or maybe I'll just call MemberwiseClone() — that clones everything, right?"

JuniorClone.cs
public class Report
{
    public string Department { get; set; } = "";
    public List<string> Sections { get; set; } = new();
    public Dictionary<string, decimal> Figures { get; set; } = new();
    public ComplianceData Compliance { get; set; } = new();

    // ❌ Junior approach #1: Manual property copy (misses nested objects)
    public Report CopyReport()
    {
        return new Report
        {
            Department = this.Department,
            Sections = this.Sections,           // ← SHARED reference!
            Figures = this.Figures,              // ← SHARED reference!
            Compliance = this.Compliance         // ← SHARED reference!
        };
    }

    // ❌ Junior approach #2: MemberwiseClone (same problem, hidden)
    public Report CloneReport() => (Report)this.MemberwiseClone();
}

// Bug in production:
var template = LoadExpensiveTemplate();  // 200ms
var engineering = template.CopyReport();
engineering.Department = "Engineering";
engineering.Sections.Add("Sprint Velocity");    // ← Also adds to template!
engineering.Figures["headcount"] = 42;          // ← Also modifies template!

var marketing = template.CopyReport();
marketing.Department = "Marketing";
// marketing.Sections already contains "Sprint Velocity" — from engineering!
// Template is corrupted. Every subsequent clone gets engineering's data.

Problems

Shared References Corrupt Data

Shallow copy means all reports share the same Sections list, Figures dictionary, and Compliance object. Modifying one report silently corrupts every other report — including the template.

No Type Safety on Clone

MemberwiseClone() returns object, so every call site needs a cast. Worse, the hand-written CopyReport() always news up the base Report: a subclass that inherits it silently gets a sliced base-type copy, and a cast back to the subclass fails at runtime with no compiler warning.
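A sketch of the runtime failure, assuming someone later derives from the junior Report (ExecutiveReport is hypothetical; Report is trimmed to the relevant members):

```csharp
Report source = new ExecutiveReport { Department = "Exec", BoardSummary = "Q1 on track" };

var sliced = source.CopyReport();        // hand-written clone news up a plain Report
// var exec = (ExecutiveReport)sliced;   // ← InvalidCastException: BoardSummary was sliced away

var preserved = source.CloneReport();    // MemberwiseClone keeps the runtime type...
var ok = (ExecutiveReport)preserved;     // ...so this cast works, but nothing forced you to get it right

public class Report
{
    public string Department { get; set; } = "";
    public Report CopyReport() => new Report { Department = this.Department };
    public Report CloneReport() => (Report)this.MemberwiseClone();
}

public class ExecutiveReport : Report
{
    public string BoardSummary { get; set; } = "";
}
```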

No Identity Reset

The clone keeps the template's ID, creation timestamp, and audit trail. Saving the clone creates a duplicate-key violation or overwrites the template's database record.

How a Senior Thinks

"I need a typed deep clone that creates fully independent copies. The clone must reset identity fields (ID, timestamps), deep-copy all mutable collections, and be testable — I should be able to verify that modifying a clone never affects the original."

IPrototype.cs
// ✅ Generic, typed prototype interface — no object casts needed
public interface IPrototype<out T> where T : class
{
    T DeepClone();
}
Report.cs
public class ComplianceData
{
    public bool GdprChecked { get; set; }
    public bool Sox404Checked { get; set; }
    public List<string> AuditNotes { get; set; } = new();

    public ComplianceData DeepClone() => new()
    {
        GdprChecked = this.GdprChecked,
        Sox404Checked = this.Sox404Checked,
        AuditNotes = new List<string>(this.AuditNotes)
    };
}

public class Report : IPrototype<Report>
{
    public Guid Id { get; private set; } = Guid.NewGuid();
    public DateTime Created { get; private set; } = DateTime.UtcNow;
    public string Department { get; set; } = "";
    public List<string> Sections { get; set; } = new();
    public Dictionary<string, decimal> Figures { get; set; } = new();
    public ComplianceData Compliance { get; set; } = new();

    public Report DeepClone()
    {
        return new Report
        {
            // Id and Created auto-generate new values (Guid.NewGuid, DateTime.UtcNow)
            Department = this.Department,
            Sections = new List<string>(this.Sections),                // deep copy list
            Figures = new Dictionary<string, decimal>(this.Figures),   // deep copy dict
            Compliance = this.Compliance.DeepClone()                   // deep copy object
        };
    }
}
ReportRegistry.cs
// ✅ Thread-safe prototype registry with immutable prototype storage
public sealed class ReportRegistry
{
    private readonly IReadOnlyDictionary<string, Report> _prototypes;

    public ReportRegistry()
    {
        var template = BuildExpensiveBaseReport();  // 200ms — only happens once
        _prototypes = new Dictionary<string, Report>
        {
            ["engineering"] = ConfigureForDepartment(template, "Engineering"),
            ["marketing"]   = ConfigureForDepartment(template, "Marketing"),
            ["finance"]     = ConfigureForDepartment(template, "Finance"),
        };
    }

    public Report Create(string department)
    {
        if (!_prototypes.TryGetValue(department.ToLowerInvariant(), out var proto))
            throw new ArgumentException($"No template for department: {department}");
        return proto.DeepClone();  // always returns independent copy
    }

    private static Report BuildExpensiveBaseReport()
    {
        // Simulates loading templates, parsing formatting rules, running compliance checks
        var report = new Report();
        report.Sections.AddRange(new[] { "Header", "Compliance", "Summary", "Footer" });
        report.Compliance = new ComplianceData { GdprChecked = true, Sox404Checked = true };
        return report;
    }

    private static Report ConfigureForDepartment(Report baseReport, string dept)
    {
        var clone = baseReport.DeepClone();
        clone.Department = dept;
        return clone;
    }
}

Design Decisions

Generic IPrototype<T> Instead of ICloneable

Returns the concrete type, not object. No casting, no runtime surprises. And because T is covariant (out T), an IPrototype<DerivedReport> can be used anywhere an IPrototype<Report> is expected, so a single collection of IPrototype<Report> can hold prototypes for every Report subclass.
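A quick sketch of what the covariance buys (FinanceReport is a hypothetical subclass):

```csharp
using System.Collections.Generic;

IPrototype<FinanceReport> fin = new FinanceReport { Department = "Finance" };
IPrototype<Report> asBase = fin;       // legal only because of 'out T'
Report clone = asBase.DeepClone();     // statically a Report, actually a FinanceReport

var prototypes = new List<IPrototype<Report>> { fin };  // one list, many concrete prototypes

public interface IPrototype<out T> where T : class
{
    T DeepClone();
}

public class Report
{
    public string Department { get; set; } = "";
}

public class FinanceReport : Report, IPrototype<FinanceReport>
{
    public FinanceReport DeepClone() => new() { Department = this.Department };
}
```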

Identity Fields Auto-Reset

Id and Created have private set and default to Guid.NewGuid() / DateTime.UtcNow. Every clone automatically gets a new identity — no manual reset needed, no forgotten-ID bugs.

Registry Isolates Expensive Setup

The 200ms template construction happens once in the constructor. Every subsequent Create() call is just a deep clone — microseconds instead of milliseconds. The registry also prevents accidental mutation of the prototypes.

Section 7

Evolution & History

1994–2002: GoF Book → ICloneable (.NET 1.0)

The GoF introduced Prototype as a way to avoid subclass explosion in factory hierarchies. When .NET 1.0 shipped in 2002, it included ICloneable — namespace System { public interface ICloneable { object Clone(); } } — one method, returning object, with no guidance on depth. A well-intentioned interface that immediately became problematic: the Framework Design Guidelines (Cwalina & Abrams) explicitly recommend against implementing it — "Do not implement ICloneable. It is better to provide a custom Clone method."

ICloneable-Era.cs
// .NET 1.0 approach — ICloneable
public class Customer : ICloneable
{
    public string Name { get; set; }
    public ArrayList Orders { get; set; }  // no generics yet!

    public object Clone()  // ← returns object, caller must cast
    {
        // Deep or shallow? The interface doesn't say!
        // Different classes in the BCL do it differently:
        // - Array.Clone() → shallow
        // - DataSet.Clone() → shallow (schema only, no data!)
        // - DataSet.Copy() → deep (schema + data)
        Customer clone = (Customer)this.MemberwiseClone();
        clone.Orders = (ArrayList)this.Orders.Clone();  // ArrayList.Clone is shallow too!
        return clone;
    }
}

// Caller has no idea if this is deep or shallow
Customer original = new Customer { Name = "Alice", Orders = new ArrayList { "Order1" } };
Customer copy = (Customer)original.Clone();  // ← cast required, could throw
Problems

1. Clone() returns object — needs an unsafe cast at every call site.
2. No contract for deep vs shallow — every implementer makes a different choice.
3. No generics yet, so collections like ArrayList compound the type-safety problem.

Generics arrived, and developers could finally create typed clone interfaces. Meanwhile the community converged on BinaryFormatter (System.Runtime.Serialization.Formatters.Binary.BinaryFormatter, a serializer that converts objects to and from binary streams) as a "universal deep clone" trick: serialize to bytes, deserialize back. It could serialize almost any object graph, including circular references, via [Serializable] — but it was insecure (arbitrary code execution during deserialization), was deprecated in .NET 5, and was removed entirely in .NET 9.

GenericsEra.cs
// Typed prototype interface — no more object casts
public interface IPrototype<T>
{
    T DeepClone();
}

// BinaryFormatter "universal" deep clone (pre-.NET 5)
[Serializable]
public class Order : IPrototype<Order>
{
    public int Id { get; set; }
    public List<OrderLine> Lines { get; set; } = new();

    public Order DeepClone()
    {
        using var stream = new MemoryStream();
        var formatter = new BinaryFormatter();
        formatter.Serialize(stream, this);
        stream.Position = 0;
        return (Order)formatter.Deserialize(stream);
    }
}

// Generic extension method — clone anything [Serializable]
public static class CloneExtensions
{
    public static T DeepClone<T>(this T source) where T : class
    {
        using var stream = new MemoryStream();
        var formatter = new BinaryFormatter();
        formatter.Serialize(stream, source);
        stream.Position = 0;
        return (T)formatter.Deserialize(stream);
    }
}
Problems

1. BinaryFormatter is a security vulnerability: deserialization can execute arbitrary code, so an attacker who controls the serialized byte stream can craft a Remote Code Execution (RCE) payload. Microsoft deprecated it in .NET 5 and removed it entirely in .NET 9.
2. It requires [Serializable] on every class in the graph.
3. It is very slow compared to manual cloning (~100x slower).

With BinaryFormatter deprecated, the community shifted to System.Text.Json (STJ) — Microsoft's built-in JSON serializer, shipping with .NET Core 3.0+ — as the serialization-based cloning mechanism. STJ is much faster than Newtonsoft.Json for simple cases, though it doesn't handle circular references by default (opt in via ReferenceHandler.Preserve). Source generators — a Roslyn compiler feature that emits additional source at build time instead of inspecting types via reflection at runtime — made it AOT-friendly, with zero runtime overhead and compile-time type safety.

JsonClone.cs
using System.Text.Json;

// JSON-based deep clone — no [Serializable] needed
public static class JsonCloneExtensions
{
    private static readonly JsonSerializerOptions Options = new()
    {
        ReferenceHandler = System.Text.Json.Serialization.ReferenceHandler.Preserve,
        WriteIndented = false
    };

    public static T DeepClone<T>(this T source) where T : class
    {
        var json = JsonSerializer.Serialize(source, Options);
        return JsonSerializer.Deserialize<T>(json, Options)!;
    }
}

// Usage
var original = new Order { Id = 1, Lines = { new OrderLine { Product = "Widget", Qty = 5 } } };
var clone = original.DeepClone();  // fully independent copy via JSON round-trip

C# 9 records gave us the closest thing to language-level Prototype support. For public record Person(string Name, int Age), the compiler automatically generates a copy constructor — protected Person(Person original) { Name = original.Name; Age = original.Age; } — plus a hidden <Clone>$ method that calls it; the with expression uses this machinery under the hood.

RecordPrototype.cs
// Records: compiler-generated Prototype pattern!
public record Address(string Street, string City, string Zip);
public record Person(string Name, int Age, Address Address);

var alice = new Person("Alice", 30, new Address("123 Main St", "Springfield", "62701"));

// 'with' expression — clone with modifications
var bob = alice with { Name = "Bob", Age = 25 };
// bob.Address is the SAME reference as alice.Address — shallow copy!

// For deep semantics, use nested records (they're immutable, so sharing is safe)
// But if Address were a mutable class? Changing bob.Address.City changes alice too!

// Record structs (C# 10) — value-type records, always deep copied
public record struct Money(decimal Amount, string Currency);
var price = new Money(9.99m, "USD");
var discounted = price with { Amount = 7.99m };  // truly independent (value type)
Caveat: Records with with do shallow memberwise cloning. This is fine when all properties are immutable (strings, ints, other records), but dangerous with mutable reference types (List, Dictionary, mutable classes). Immutable data + records = the modern alternative to Prototype for many use cases.
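To make the caveat concrete, a minimal sketch of the mutable-collection trap (Playlist is a hypothetical record):

```csharp
using System.Collections.Generic;

var original = new Playlist("Road Trip", new List<string> { "Track A" });
var copy = original with { Name = "Road Trip v2" };  // shallow memberwise clone

copy.Tracks.Add("Track B");                // mutates the SHARED Tracks list
Console.WriteLine(original.Tracks.Count);  // 2 — the "clone" leaked back into the original
Console.WriteLine(ReferenceEquals(original.Tracks, copy.Tracks));  // True

public record Playlist(string Name, List<string> Tracks);
```

Swap List<string> for an immutable collection, or deep-copy the list in a custom copy constructor, and the trap disappears.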

The modern .NET ecosystem offers multiple cloning strategies. Mapperly (a source generator by Riok) emits object-to-object mapping code at compile time — zero reflection, AOT-compatible — and can be used for cloning by mapping a type to itself. MessagePack (via the MessagePack-CSharp library) is a binary format far more compact and faster than JSON, supports AOT through source generation, and is the go-to choice for serialization-based deep clones: typically 3-10x faster than System.Text.Json, though the gap narrows with STJ source generators. And immutable data patterns sidestep cloning entirely.

ModernClone.cs
// Approach 1: Mapperly source-generated clone
[Mapper]
public static partial class ReportMapper
{
    [MapperIgnoreTarget(nameof(Report.Id))]      // new ID per clone
    [MapperIgnoreTarget(nameof(Report.Created))]  // new timestamp per clone
    public static partial Report Clone(Report source);
}

// Approach 2: MessagePack binary deep clone
[MessagePackObject]
public class GameEntity
{
    [Key(0)] public string Name { get; set; } = "";
    [Key(1)] public Stats Stats { get; set; } = new();
    [Key(2)] public AiConfig Ai { get; set; } = new();

    public GameEntity DeepClone()
    {
        var bytes = MessagePackSerializer.Serialize(this);
        return MessagePackSerializer.Deserialize<GameEntity>(bytes);
    }
}

// Approach 3: Immutability — no cloning needed
public record ImmutableConfig(
    string Environment,
    string DbHost,
    int DbPort,
    IReadOnlyList<string> AllowedOrigins);

var prod = new ImmutableConfig("production", "prod-db", 5432,
    new[] { "https://myapp.com" });
// "Clone with changes" via with — no mutation possible
var staging = prod with { Environment = "staging", DbHost = "staging-db" };
Section 8

Prototype in .NET Core

The Prototype pattern appears throughout the .NET ecosystem — though often under different names. Here are the most important examples in detail:

Object.MemberwiseClone() — The Built-in Shallow Prototype

Every .NET object inherits MemberwiseClone() from System.Object. It's protected, so only the class itself (or subclasses) can call it. It creates a bitwise copyMemberwiseClone performs a shallow, field-by-field copy. Value-type fields are copied by value (independent). Reference-type fields are copied by reference (shared). It's implemented as a native CLR intrinsic — extremely fast (~2-5 nanoseconds) because it's essentially a memory block copy. — extremely fast but always shallow.

MemberwiseClone.cs
public class ConnectionSettings
{
    public string Host { get; set; } = "";
    public int Port { get; set; }
    public TimeSpan Timeout { get; set; }
    // All value types + immutable string → MemberwiseClone is SAFE here

    public ConnectionSettings ShallowClone() => (ConnectionSettings)MemberwiseClone();
}

// Safe usage — all fields are value types or immutable
var original = new ConnectionSettings { Host = "db.prod", Port = 5432, Timeout = TimeSpan.FromSeconds(30) };
var clone = original.ShallowClone();
clone.Port = 3306;  // independent — Port is a value type
// original.Port is still 5432

C# Records and with Expressions — The Idiomatic Prototype

C# records are the most idiomatic way to implement Prototype in modern .NET. The compiler generates a hidden <Clone>$ method and a copy constructor. The with expression is syntactic sugar for "clone, then override specific properties."

RecordWith.cs
// The compiler generates Clone + copy constructor for you
public record HttpRequestOptions(
    string Url,
    string Method,
    IReadOnlyDictionary<string, string> Headers,
    TimeSpan Timeout);

var defaultOpts = new HttpRequestOptions(
    Url: "", Method: "GET",
    Headers: new Dictionary<string, string> { ["Accept"] = "application/json" },
    Timeout: TimeSpan.FromSeconds(30));

// 'with' = clone + override
var postOpts = defaultOpts with { Method = "POST", Url = "/api/orders" };
var longOpts = defaultOpts with { Timeout = TimeSpan.FromMinutes(5) };

// Under the hood, the compiler emits:
// var temp = defaultOpts.<Clone>$();   // calls the generated copy constructor (memberwise)
// temp.Method = "POST";
// temp.Url = "/api/orders";
// postOpts = temp;

Entity Framework Core — Cloning Tracked Entities

Entity Framework Core's change trackerEF Core tracks every entity loaded from the database. When you call SaveChanges(), it compares current values to original values and generates UPDATE statements. This tracking means entities are "attached" to a DbContext — you can't just clone them and save the clone without detaching first or using AsNoTracking. complicates cloning. The common pattern for "duplicate a database record" is: load with AsNoTracking(), reset the primary key, then Add() the clone.

EfCoreClone.cs
// Clone an EF Core entity (e.g., duplicate a product template)
public async Task<Product> CloneProductAsync(int sourceId, AppDbContext db)
{
    // Load without tracking — gives us a "detached" entity
    var source = await db.Products
        .AsNoTracking()
        .Include(p => p.Variants)
        .Include(p => p.Tags)
        .FirstAsync(p => p.Id == sourceId);

    // Reset identity — EF will generate new IDs on save
    source.Id = 0;
    source.Name = $"{source.Name} (Copy)";
    source.CreatedAt = DateTime.UtcNow;
    foreach (var variant in source.Variants)
        variant.Id = 0;  // reset child IDs too

    db.Products.Add(source);  // add as new entity
    await db.SaveChangesAsync();
    return source;
}

HttpRequestMessage — A Deliberate Absence of Prototype

A notable absence of Prototype in .NET: HttpRequestMessage cannot be cloned or reused. Once sent, it's marked as consumed (and its content disposed) — sending it again throws InvalidOperationException. If you need to retry, you must rebuild it from scratch — a design decision that forces developers to use the PollyA .NET resilience and transient-fault-handling library. It provides policies like Retry, Circuit Breaker, Timeout, and Bulkhead. In .NET 8+, Microsoft.Extensions.Http.Resilience wraps Polly and integrates with HttpClientFactory for automatic retry policies on HTTP calls. retry pattern with request factories instead of cloning.

HttpRequestFactory.cs
// ❌ Can't do this — HttpRequestMessage is disposed after sending
var request = new HttpRequestMessage(HttpMethod.Get, "/api/data");
await client.SendAsync(request);
await client.SendAsync(request);  // InvalidOperationException: message was already sent!

// ✅ Factory pattern instead of Prototype
Func<HttpRequestMessage> requestFactory = () =>
    new HttpRequestMessage(HttpMethod.Get, "/api/data")
    {
        Headers = { { "Authorization", $"Bearer {token}" } }
    };

// Polly retry creates a fresh request each time
var retryPolicy = Policy.Handle<HttpRequestException>()
    .RetryAsync(3);
await retryPolicy.ExecuteAsync(async () =>
    await client.SendAsync(requestFactory()));  // new message each retry

System.Collections.Immutable — Structural Sharing Instead of Cloning

System.Collections.ImmutableA NuGet package (included in .NET Core) providing immutable collection types: ImmutableList<T>, ImmutableDictionary<K,V>, ImmutableArray<T>, etc. Add/Remove operations return NEW collections without modifying the original. Internally, they use structural sharing (balanced trees) so the "copy" only allocates O(log n) new nodes instead of O(n). collections use a technique called structural sharing — related to Prototype but more efficient. Instead of cloning the entire collection, Add() returns a new collection that shares most of its internal structure with the original.

ImmutableCollections.cs
using System.Collections.Immutable;

// Immutable list — "clone with modification" without copying everything
var original = ImmutableList.Create("A", "B", "C");
var modified = original.Add("D");  // returns NEW list ["A","B","C","D"]
// original is still ["A","B","C"] — but internally they share nodes

// Immutable dictionary — great for feature flag prototypes
var baseFlags = ImmutableDictionary<string, bool>.Empty
    .Add("dark-mode", true)
    .Add("beta-search", false);

var stagingFlags = baseFlags.SetItem("beta-search", true);  // "clone + override"
// baseFlags["beta-search"] is still false
// stagingFlags["beta-search"] is true
// Internal nodes are shared — efficient even with 10,000 keys
Section 9

When To Use / When Not To

Use When

Object creation is expensiveDatabase queriesLoading an entity from a database involves network I/O, query parsing, and result materialization. A single query can take 1-50ms. If you need 100 similar objects, cloning one query result is far faster than executing 100 queries., file parsing, network calls, or complex calculations are needed to build the object. Clone the result instead of repeating the work.
Many similar objects with small variations — Reports for different departments, game enemies with tweaked stats, config per environment. Clone a template, change what differs.
Avoiding parallel factory hierarchiesIn Factory Method, each product variant needs a corresponding creator subclass: GoblinFactory, OrcFactory, DragonFactory. With 20 enemy types, that's 20 factory classes. Prototype eliminates this — just store 20 prototype instances in a dictionary. — Instead of creating a factory for every product variation, store prototype instances and clone them on demand.
Runtime-configured object creation — When the types to create are specified at runtime (loaded from config, user-defined templates), store prototypical instances in a prototype registryA centralized lookup (usually a Dictionary<string, IPrototype>) that maps names/keys to pre-configured prototype instances. The client asks the registry for a clone by name, without knowing or caring about the concrete type. Common in game dev, document systems, and plugin architectures..
Undo/snapshotUndo functionality works by saving the object's state before each mutation. If the user clicks undo, restore the saved snapshot. Prototype provides the cloning mechanism to create these snapshots. Combined with the Memento pattern (which manages the snapshot storage), you get a complete undo system. functionality — Clone the current state before a mutation. If the user clicks "undo," restore the clone. (Related to the Memento patternA GoF behavioral pattern that captures an object's internal state so it can be restored later. Prototype and Memento are often used together: Prototype provides the cloning mechanism, Memento provides the storage/restoration protocol. The clone IS the memento..)
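The prototype registry described above fits in a few lines. This is a sketch under assumed names (IPrototype<T>, Enemy, PrototypeRegistry<T> are illustrative, not from the text): the client asks for a clone by key and never touches the concrete type.

```csharp
using System;
using System.Collections.Generic;

public interface IPrototype<out T> where T : class
{
    T DeepClone();
}

// Illustrative product type with a proper deep clone
public class Enemy : IPrototype<Enemy>
{
    public string Name { get; set; } = "";
    public int Health { get; set; }
    public List<string> Loot { get; set; } = new();

    // New object, new list: clones are fully independent
    public Enemy DeepClone() => new()
    {
        Name = Name,
        Health = Health,
        Loot = new List<string>(Loot)
    };
}

// Minimal registry: maps keys to pre-configured prototypes
public class PrototypeRegistry<T> where T : class, IPrototype<T>
{
    private readonly Dictionary<string, T> _prototypes = new();

    public void Register(string key, T prototype) => _prototypes[key] = prototype;

    // Always hands out a deep copy, never the template itself
    public T Create(string key) => _prototypes[key].DeepClone();
}

// Usage:
// var registry = new PrototypeRegistry<Enemy>();
// registry.Register("goblin", new Enemy { Name = "Goblin", Health = 30, Loot = { "dagger" } });
// var g = registry.Create("goblin");   // independent copy
// g.Loot.Add("gold");                  // the registered template is untouched
```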

Don't Use When

Objects are simple to construct — If new Thing(a, b, c) takes microseconds and has no complex setup, cloning adds complexity for zero benefit.
Objects are immutableAn immutable object cannot be modified after creation. In C#, strings are immutable (every "modification" creates a new string). Records with init-only properties are immutable. ImmutableList<T> returns new collections on every Add/Remove. Immutable objects are inherently thread-safe and safe to share without cloning. — Immutable types (strings, records with only immutable fields, ImmutableList) don't need cloning. They're already safe to share — use with expressions instead.
Only a few properties differ — If the variations are predictable and few, a BuilderThe Builder pattern constructs objects step-by-step with explicit configuration. When you know exactly what varies (3-4 properties), Builder is clearer than "clone everything then change 3 things." Prototype shines when the variation is small relative to the total setup cost. or factoryFactory patterns (Factory Method, Abstract Factory) create objects through dedicated creator classes or methods. They're ideal when creation logic varies per type. Unlike Prototype, factories construct from scratch — they don't start from an existing instance. with parameters is more readable than "clone then modify."
Circular references are complex — Deep cloning objects with circular references (A → B → A) requires special handling. If your object graph is a tangled web, serialization-based cloning or a visitor pattern may be better than manual Prototype.
Objects have external resources — Cloning a database connectionA DbConnection (SqlConnection, NpgsqlConnection) represents an open TCP socket to a database server. Cloning it would mean two objects sharing one socket — concurrent reads/writes would corrupt the protocol. Database connections are managed by connection pools, not by cloning., file handle, or HTTP client is dangerous or meaningless. Resources should be managed, not cloned.
Decision guide (reconstructed from the flowchart):
Object expensive to create? NO → just use new. YES → continue.
Need many variations? NO → Builder (complex, single output). YES → continue.
Object immutable (record)? YES → with expression. NO → Prototype (ICloneable / DeepCopy).
Section 10

Comparisons

Prototype vs Factory Method

PrototypeCreates new objects by cloning an existing instance. The "factory" is the prototype itself — it knows how to copy itself. No need for a parallel hierarchy of creator classes.
  • Creates by cloning an existing instance
  • No class hierarchy needed — just objects + Clone()
  • Variants are runtime-configured (swap prototypes)
  • Best when objects are expensive to construct
VS
Factory MethodDefines an interface for creating objects but lets subclasses decide which class to instantiate. Each product variant requires a corresponding creator subclass — leading to parallel hierarchies that Prototype avoids.
  • Creates via subclass override of a factory method
  • Requires a creator class per product variant
  • Variants are compile-time class hierarchy
  • Best when creation logic differs per variant

Prototype vs Builder

Prototype
  • Starts from a fully-formed object and clones it
  • Tweaks a few properties post-clone
  • Fast when the base object is expensive to create
  • Risk: shallow copy bugs if clone is incomplete
VS
BuilderConstructs complex objects step-by-step. Unlike Prototype (which starts with everything and tweaks), Builder starts with nothing and adds pieces. Builder guarantees a valid object via the Build() method; Prototype assumes the source was already valid.
  • Starts from nothing and builds step-by-step
  • Explicit control over every property
  • Always constructs from scratch (no cloning shortcuts)
  • Validates at Build() time — guaranteed valid

Prototype vs new Keyword

Prototype (Clone)
  • Client doesn't know the concrete type
  • Preserves expensive pre-computed state
  • Dynamic: swap prototype at runtime → different products
  • Complexity: must implement and maintain Clone()
VS
new Keyword
  • Client is coupled to the concrete type
  • Reconstructs everything from scratch each time
  • Static: changing the type means changing the code
  • Simple: no clone logic, no copy bugs
Section 11

SOLID Connections

PrincipleRelationExplanation
SRPSingle Responsibility Principle — a class should have only one reason to change. In Prototype context: the Clone() method is the object's responsibility because only the object knows its own internal state well enough to copy it correctly. Supports Clone logic belongs to the object itself — only it knows which fields need deep copying. Putting clone logic elsewhere (a separate "cloner" class) would require exposing internal state, violating encapsulationThe OOP principle of hiding an object's internal state and requiring all interaction through well-defined methods. If a Cloner class needs to copy private fields, those fields must become public — breaking encapsulation. Self-cloning preserves it because the object has full access to its own private state..
OCPOpen/Closed Principle — classes should be open for extension but closed for modification. Prototype supports OCP because adding new clonable types doesn't require modifying existing client code or factory hierarchies. Supports New prototype types can be added without modifying the client or the registry. Add a new class that implements IPrototype<T>, register it, and the client clones it without code changes.
LSPLiskov Substitution Principle — subtypes must be substitutable for their base types. A cloned object must behave exactly like the original — if Clone() returns an object that violates the original's invariants (e.g., missing required fields), it violates LSP. Depends The cloned object MUST honor the original's contract. If Clone() produces an object that fails validation (missing required fields, broken invariants), it violates LSP. Test that clones pass the same validation as originals.
ISPInterface Segregation Principle — clients shouldn't be forced to depend on methods they don't use. ICloneable violates ISP by returning 'object' (forcing casts). A generic IPrototype<T> is better because it's specific and type-safe. Supports IPrototype<T> has exactly one method: DeepClone(). It's a focused, single-purpose interface. Compare with ICloneable which technically has one method too — but the untyped return violates the spirit of ISP by forcing callers to know the concrete type anyway.
DIPDependency Inversion Principle — depend on abstractions, not concretions. The client depends on IPrototype<T> (abstraction), not on ConcretePrototype (implementation). The prototype registry can inject different prototypes without the client knowing. Supports The client calls IPrototype<T>.DeepClone() without knowing the concrete type. A prototype registryA Dictionary<string, IPrototype<T>> that maps keys to prototype instances. The client asks for a clone by key, and the registry returns a deep copy. The client never references the concrete class — pure DIP. can swap implementations at runtime, and the client is completely decoupled.
Section 12

Bug Case Studies

Bug 1: Shallow Copy Shares List — Mutating Clone Corrupts Original

The Incident

Tuesday 2:30 PM. The e-commerce team ships a shiny new "duplicate product" feature. Product managers love it — clone a template product, tweak a few fields, and publish. Easy. Within three hours, support tickets start flooding in: "My template product's tags keep changing on their own!"

Here's the story, step by step. The team has a template product in the database with Tags = ["electronics", "sale"]. When a product manager clicks "Duplicate," the system calls template.Clone() to make a copy. The clone method uses MemberwiseClone(), which copies every field. Sounds fine, right?

The trap is that MemberwiseClone() does a shallow copyA copy that duplicates the outer object but shares internal references. Think of it like photocopying a folder of sticky notes — you get a new folder, but the sticky notes inside are the same ones. If someone moves a sticky note in your copy, it disappears from the original too.. For simple values like Price = 29.99m or Name = "Widget", it creates independent copies — great. But for the Tags list, it copies the reference (the arrow pointing to the list), not the list itself. So the template and the clone both point to the exact same list in memory.

User A clones the template and adds "featured" to their product's tags. Because the clone's Tags list IS the template's Tags list, the template now has ["electronics", "sale", "featured"]. Five minutes later, User B clones the same template — and their "fresh" clone already has the "featured" tag that User A added. User B adds "clearance", which also mutates the template. The corruption snowballs with every clone.

The bug was intermittentA bug that only appears sometimes, depending on timing, data, or environment. Shallow copy bugs are often intermittent because they depend on the ORDER of operations: clone A, modify A, clone B — B sees A's changes. If B is cloned before A is modified, the bug doesn't appear. — it only showed up when two users cloned the same template in a short window. If only one person cloned, they never noticed the shared reference. It took 4 hours of investigation before someone realized the Tags list was the same object in memory.

[Diagram: the Template Product (Name = "Widget", Price = 29.99) and User A's clone (own Name) both point at ONE shared List in memory: ["electronics", "sale", "featured", "clearance"]. Mutating one mutates all.]
ShallowCopyBug.cs
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; } = "";
    public decimal Price { get; set; }
    public List<string> Tags { get; set; } = new();       // ❌ mutable reference type
    public Dictionary<string, string> Attrs { get; set; } = new(); // ❌ mutable reference type

    public Product Clone() => (Product)MemberwiseClone(); // ❌ shallow copy!
}

// User A clones template
var template = LoadTemplate();                  // Tags = ["electronics", "sale"]
var userA = template.Clone();
userA.Name = "User A's Product";
userA.Tags.Add("featured");                     // ❌ Also adds to template.Tags!

// User B clones the SAME template 5 minutes later
var userB = template.Clone();
userB.Tags.Add("clearance");                    // ❌ Also adds to template.Tags!

// template.Tags is now ["electronics", "sale", "featured", "clearance"]
// Every subsequent clone gets ALL tags from previous clones
// The database is now full of products with wrong tags

Walking through the buggy code: The Product class uses MemberwiseClone() which copies all fields at face value. For Name (a string, which is immutable in C#), this is perfectly safe — each clone gets its own independent string. But Tags is a List<string>, which is a mutable reference type. MemberwiseClone() copies the pointer to the list, not the list itself. So template.Tags, userA.Tags, and userB.Tags are all the exact same list object. Adding a tag to any of them adds it to all of them.

DeepCopyFix.cs
public class Product : IPrototype<Product>
{
    public int Id { get; set; }
    public string Name { get; set; } = "";
    public decimal Price { get; set; }
    public List<string> Tags { get; set; } = new();
    public Dictionary<string, string> Attrs { get; set; } = new();

    public Product DeepClone() => new()
    {
        // ✅ Id = 0 → database will auto-generate new ID
        Name = this.Name,
        Price = this.Price,
        Tags = new List<string>(this.Tags),                     // ✅ new list
        Attrs = new Dictionary<string, string>(this.Attrs)      // ✅ new dict
    };
}

Why the fix works: Instead of relying on MemberwiseClone(), the fix manually constructs a new Product and creates brand-new collection instances. new List<string>(this.Tags) creates a fresh list with the same elements — but it's a completely separate list in memory. Adding "featured" to the clone's Tags only affects the clone's list. The template's list stays untouched. The dictionary gets the same treatment. Each clone is now fully independent.

How to Spot This in Your Code

Search for MemberwiseClone() in your codebase. For every class that uses it, check: does the class have ANY field that's a List<T>, Dictionary<K,V>, array, or custom class? If yes, that field is being shared between original and clone. Write a unit test: clone an object, mutate a collection field on the clone, then assert the original's collection is unchanged.
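That unit test is only a few lines. Sketched here with plain assertions rather than a specific test framework, and with the deep-cloning Product from DeepCopyFix.cs condensed inline so the snippet compiles on its own:

```csharp
using System;
using System.Collections.Generic;

// Condensed copy of the fixed Product (see DeepCopyFix.cs above)
public class Product
{
    public string Name { get; set; } = "";
    public List<string> Tags { get; set; } = new();

    public Product DeepClone() => new()
    {
        Name = Name,
        Tags = new List<string>(Tags)   // fresh list per clone
    };
}

public static class CloneIndependenceTest
{
    public static void Main()
    {
        var template = new Product { Name = "Widget", Tags = { "electronics", "sale" } };

        var clone = template.DeepClone();
        clone.Tags.Add("featured");          // mutate the CLONE only

        // This assertion fails if the clone were a MemberwiseClone:
        // the two objects would share one list
        if (template.Tags.Count != 2)
            throw new Exception("Clone shares its Tags list with the template!");

        Console.WriteLine("clone is independent");
    }
}
```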

Lesson Learned

MemberwiseClone() is safe ONLY when all fields are value types or immutable. The moment you have a List<T>, Dictionary<K,V>, or any mutable class — you MUST deep copy those fields explicitly.

Bug 2: ICloneable's object Return Type — Silent Field Loss

The Incident

Friday 5:47 PM. A developer implements ICloneable on a base class Shape. Subclasses Circle and Rectangle inherit the clone. A new developer joins the team and adds Triangle extending Shape. Everything compiles, tests pass, the feature ships.

Two weeks later, QA files a bizarre ticket: "Triangles are rendering as tiny dots." Triangles that were cloned from templates appeared on screen as invisible or single-pixel shapes. Non-cloned triangles rendered perfectly.

Here's what happened. The original developer wrote Shape.Clone() using MemberwiseClone(), which copies ALL fields (even subclass fields). That worked fine for Circle and Rectangle. But the new developer looked at the Clone() method, saw it returned object, and thought: "I should override this with a proper implementation." So they manually constructed a new Triangle and copied the base class fields (X, Y, Color) — but forgot SideA, SideB, and SideC.

The Clone() call compiled fine. The cast to Triangle worked at runtime. No exception, no warning, no error. But the cloned triangle's sides were all 0.0 (the default for double). A triangle with zero-length sides is a dot — or nothing at all. The bug was silent. The return type object gave the compiler nothing to check against, so the missing fields were invisible until QA caught it visually.

Diagnosis took 45 minutes. No exception was thrown, so there was no stack trace to follow. The team had to trace every cloned shape back to its source and compare field values to discover the zeroed-out sides.

[Diagram: the Original Triangle (X = 10, Y = 20, Color = "red", SideA = 3, SideB = 4, SideC = 5) passes through Clone(), which returns object with no type info. After the cast, the clone keeps X, Y, Color but SideA = SideB = SideC = 0. No compiler warning, no runtime exception: silent data loss, caught only by visual inspection.]
ICloneableBug.cs
public abstract class Shape : ICloneable
{
    public double X { get; set; }
    public double Y { get; set; }
    public string Color { get; set; } = "black";

    public virtual object Clone() => MemberwiseClone();  // ❌ returns object
}

public class Triangle : Shape
{
    public double SideA { get; set; }
    public double SideB { get; set; }
    public double SideC { get; set; }

    // ❌ New developer manually constructs instead of using MemberwiseClone
    public override object Clone()
    {
        return new Triangle  // ← Manually reconstructing instead of MemberwiseClone
        {
            X = this.X, Y = this.Y, Color = this.Color
            // Oops! Forgot SideA, SideB, SideC — they default to 0
        };
    }
}

// In production code:
Shape shape = GetShapeFromDatabase(); // actually a Triangle
var copy = (Triangle)shape.Clone();   // ❌ Cast works, but SideA/B/C are all 0!

Walking through the buggy code: The base class Shape implements ICloneable, which forces Clone() to return object. The new developer overrides Clone() in Triangle and manually constructs a new triangle. They remember to copy the base class fields (X, Y, Color), but forget the triangle-specific fields (SideA, SideB, SideC). Because the return type is object, the compiler has no idea what fields should be there. No warning, no error. The double fields default to 0.0 silently.

TypedCloneFix.cs
// ✅ Generic interface — compiler enforces correct return type
public interface IPrototype<out T> where T : class
{
    T DeepClone();
}

public abstract class Shape
{
    public double X { get; set; }
    public double Y { get; set; }
    public string Color { get; set; } = "black";
}

public class Triangle : Shape, IPrototype<Triangle>
{
    public double SideA { get; set; }
    public double SideB { get; set; }
    public double SideC { get; set; }

    public Triangle DeepClone() => new()    // ✅ returns Triangle, not object
    {
        X = this.X, Y = this.Y, Color = this.Color,
        SideA = this.SideA, SideB = this.SideB, SideC = this.SideC
    };
}

// No cast needed — type is correct at compile time
Triangle original = new() { X = 0, Y = 0, SideA = 3, SideB = 4, SideC = 5 };
Triangle copy = original.DeepClone();  // ✅ returns Triangle directly

Why the fix works: By using a generic IPrototype<T> interface, each class explicitly returns its own concrete type. Triangle.DeepClone() returns Triangle, not object. The caller gets type-safe access without casting. And because the developer is writing Triangle DeepClone() (not object Clone()), it's much more obvious that all Triangle-specific fields need to be included. You can also add a reflection-based test to catch forgotten fields automatically (see Q14 in Interview section).

How to Spot This in Your Code

Search for : ICloneable in your codebase. Every class that implements it is at risk. Also search for (SomeType)something.Clone() — any cast after a Clone call means you're relying on runtime safety instead of compile-time safety. Replace with a typed IPrototype<T> interface.

Lesson Learned

Never use ICloneable — its object return type is a runtime bomb. Use a generic IPrototype<T> interface so the compiler catches type mismatches before they reach production.

Bug 3: Circular References — StackOverflowException During Deep Clone

The Incident

Wednesday 11 AM. The HR team reports the org-chart system is crashing. Every time someone tries to duplicate a department, the server crashes with a StackOverflowException. No useful stack trace, no recovery — the whole process dies instantly.

The system has an Employee class where each employee has a Manager (who is also an Employee) and a list of DirectReports (also Employees). This creates a two-way relationship: the CEO's DirectReports include the VP, and the VP's Manager is the CEO. They point at each other.

When someone clones the CEO, the deep clone method tries to clone every field. It clones the CEO's DirectReports list, which means cloning the VP. Cloning the VP means cloning the VP's Manager, which is... the CEO. So it tries to clone the CEO again. The CEO's DirectReports include the VP again. The VP's Manager is the CEO again. And so on, forever — like two mirrors facing each other creating an infinite reflection.

Think of it like this: imagine photocopying a family tree where every parent card says "see children" and every child card says "see parent." You'd flip back and forth between the same cards forever, never finishing the copy. That's exactly what the code does — it bounces between CEO and VP endlessly until the call stack runs out of space.

The StackOverflowException is particularly nasty because it kills the process immediately — no catch block can save you. It took 30 minutes to diagnose because the exception's stack trace was thousands of identical frames deep, making it hard to see the pattern. Adding a depth counter to the clone method finally revealed the circular dependency.

[Diagram: CEO (Manager = null, DirectReports = [VP]) and VP (Manager = CEO, DirectReports = [Dev]) reference each other. Clone CEO → clone VP → clone CEO → ... StackOverflowException: process killed, no catch, no recovery.]
CircularRefBug.cs
public class Employee : IPrototype<Employee>
{
    public string Name { get; set; } = "";
    public Employee? Manager { get; set; }            // ← circular ref!
    public List<Employee> DirectReports { get; set; } = new();  // ← circular ref!

    public Employee DeepClone()
    {
        return new Employee
        {
            Name = this.Name,
            Manager = this.Manager?.DeepClone(),  // ❌ if Manager has DirectReports
            DirectReports = this.DirectReports    //    that includes THIS employee...
                .Select(e => e.DeepClone())       // ❌ infinite recursion!
                .ToList()
        };
    }
}

// CEO → VP (manager: CEO) → Dev (manager: VP)
// Cloning CEO → clones VP → VP clones Manager (CEO) → CEO clones VP → BOOM

Walking through the buggy code: When we call ceo.DeepClone(), the method creates a new Employee, copies the Name, then tries to clone the Manager (null for CEO, so that's fine). Then it iterates through DirectReports and calls DeepClone() on each one. The VP is in that list, so it clones the VP. The VP's DeepClone() tries to clone its Manager — which is the CEO. And the CEO's clone tries to clone its DirectReports — which includes the VP. Neither side ever stops calling the other. Each call adds a frame to the call stack until the stack overflows.

CircularRefFix.cs
public class Employee
{
    public string Name { get; set; } = "";
    public Employee? Manager { get; set; }
    public List<Employee> DirectReports { get; set; } = new();

    // ✅ Use a visited-dictionary to break circular references
    public Employee DeepClone(Dictionary<Employee, Employee>? visited = null)
    {
        visited ??= new Dictionary<Employee, Employee>(ReferenceEqualityComparer.Instance);

        if (visited.TryGetValue(this, out var existing))
            return existing;  // already cloned this employee — return the clone

        var clone = new Employee { Name = this.Name };
        visited[this] = clone;  // register BEFORE cloning children

        clone.Manager = this.Manager?.DeepClone(visited);
        clone.DirectReports = this.DirectReports
            .Select(e => e.DeepClone(visited))
            .ToList();

        return clone;
    }
}

// Now: CEO.DeepClone() → clones VP → VP.Manager references CEO
// → visited[CEO] already exists → returns the clone, no infinite loop

Why the fix works: The secret weapon is a visited dictionary that remembers every object we've already started cloning. Before cloning anything, we check: "Have I seen this object before?" If yes, return the clone we already created. The critical detail is registering the clone in the dictionary before filling in its child fields. This way, when the VP tries to clone its Manager (the CEO), it finds the CEO's clone already in the dictionary and returns it — breaking the cycle. The empty CEO clone gets its children filled in after the recursive calls return.

How to Spot This in Your Code

Look for classes that reference themselves — either directly (a Manager field of the same type) or indirectly (Parent/Child relationships, graph edges, linked list nodes). If any of these classes have a DeepClone() method without a visited dictionary parameter, they're vulnerable to infinite recursion. Tree structures (where references only go downward) are safe, but any bidirectional relationship is dangerous.

Lesson Learned

Any object graph with bidirectional references needs a visited dictionary during deep clone. Register the clone BEFORE recursing into children. Alternatively, use a serializer that handles cycles (e.g., System.Text.Json with ReferenceHandler.Preserve).
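The serializer alternative can be sketched with System.Text.Json (available since .NET 5; the Employee shape mirrors the class above). ReferenceHandler.Preserve writes $id/$ref metadata so repeated and circular references round-trip instead of overflowing the stack:

```csharp
using System.Collections.Generic;
using System.Text.Json;
using System.Text.Json.Serialization;

// Build a cyclic graph: CEO ↔ VP reference each other
var ceo = new Employee { Name = "CEO" };
var vp = new Employee { Name = "VP", Manager = ceo };
ceo.DirectReports.Add(vp);

var options = new JsonSerializerOptions
{
    // Emits $id/$ref so the same object is serialized once and referenced thereafter
    ReferenceHandler = ReferenceHandler.Preserve
};

string json = JsonSerializer.Serialize(ceo, options);
var clone = JsonSerializer.Deserialize<Employee>(json, options)!;
// clone.DirectReports[0].Manager points at the cloned CEO, not the original

public class Employee
{
    public string Name { get; set; } = "";
    public Employee? Manager { get; set; }
    public List<Employee> DirectReports { get; set; } = new();
}
```

This trades speed for convenience: the JSON round-trip is orders of magnitude slower than a hand-written visited-dictionary clone, but it handles arbitrary cyclic graphs with no per-class code.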

The Incident

Monday 9 AM. A desktop app (WPF) has a split-panel view. Users can "duplicate" a panel to compare two configurations side by side. The developer clones the ViewModel to create the second panel. After cloning, editing the Name field in Panel B causes Panel A to update. Typing in Panel B makes Panel A flicker with the same text. Users report the app is "haunted."

Here's why this happens. In WPF (and other UI frameworks), when you bind a text box to a ViewModel property, the framework subscribes to the ViewModel's PropertyChanged event. When the property changes, the event fires, and the UI knows to refresh. The original ViewModel had Panel A's UI subscribed to its event.

When the developer cloned the ViewModel using MemberwiseClone(), the event field (which is a delegateA delegate in C# is essentially a pointer to a method (or a list of methods). Events are backed by delegate fields. When you do someEvent += handler, you're adding a method pointer to the delegate's invocation list. MemberwiseClone copies this pointer, so the clone inherits all the original's subscribers.) got copied along with everything else. The clone now has a PropertyChanged event that still points to Panel A's update handler. When the clone fires its event (because someone typed in Panel B), Panel A's handler runs and refreshes Panel A's UI with Panel B's data.

Think of it like this: the original ViewModel has a phone number on speed dial — "call Panel A when something changes." When you photocopy the ViewModel, the photocopy still has Panel A's phone number on speed dial. Now the clone calls Panel A every time it changes, confusing everyone.

It took 2 hours to diagnose because the symptom (Panel A updating when you edit Panel B) seemed like a data-binding bug, not a cloning bug. Nobody thought to check whether the event subscribers were shared.

[Diagram: the original and cloned ViewModels share one PropertyChanged delegate pointing at Panel A's UpdatePanelA handler; editing Panel B fires Panel A's handler, so Panel A updates]
EventBug.cs
public class ViewModel : INotifyPropertyChanged
{
    private string _name = "";
    public event PropertyChangedEventHandler? PropertyChanged;

    public string Name
    {
        get => _name;
        set { _name = value; PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(nameof(Name))); }
    }

    // ❌ MemberwiseClone copies the event's delegate list
    public ViewModel Clone() => (ViewModel)MemberwiseClone();
}

var original = new ViewModel { Name = "Panel A" };
original.PropertyChanged += (s, e) => UpdatePanelA(e);  // UI binding

var clone = original.Clone();
// clone.PropertyChanged STILL points to UpdatePanelA!
clone.Name = "Panel B";  // ❌ Fires UpdatePanelA — wrong panel updates!

Walking through the buggy code: The ViewModel has a PropertyChanged event. When Panel A's UI binds to the original, it subscribes a handler (UpdatePanelA) to this event. MemberwiseClone() copies every field, including the delegate backing the event. The clone now has the same subscriber list. When clone.Name = "Panel B" runs, it invokes PropertyChanged, which calls UpdatePanelA — the wrong panel's refresh method. The clone has no idea it's talking to the wrong audience.

EventFix.cs
public class ViewModel : INotifyPropertyChanged, IPrototype<ViewModel>
{
    private string _name = "";
    public event PropertyChangedEventHandler? PropertyChanged;

    public string Name
    {
        get => _name;
        set { _name = value; PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(nameof(Name))); }
    }

    public ViewModel DeepClone()
    {
        var clone = (ViewModel)MemberwiseClone();
        clone.PropertyChanged = null; // ✅ Do NOT copy subscribers
        // The new UI panel will subscribe its own handlers when binding
        return clone;
    }
}

Why the fix works: After cloning, we explicitly set the event field to null, wiping out the inherited subscriber list. The clone starts with a clean slate — no event subscribers at all. When Panel B's UI binds to this clone, it subscribes its own handler (UpdatePanelB). Now each panel's ViewModel talks only to its own UI. Changing Panel B only updates Panel B.

How to Spot This in Your Code

Search for any class that has both an event declaration and a Clone()/MemberwiseClone() method. Events in C# are backed by delegate fields that get shallow-copied. Also watch for INotifyPropertyChanged, INotifyCollectionChanged, or any custom event pattern on cloneable classes. After cloning, always null out event fields.

Lesson Learned

Events (delegates) are reference-type fields — MemberwiseClone() copies the delegate list. Cloned objects must NEVER inherit event subscribers from the original. Always create a fresh instance and let the new context subscribe its own handlers.

The Incident

Thursday 3 PM. An admin dashboard lets training managers "duplicate" a course. The clone button works great in development (auto-increment IDs, fresh database). But in production, saving the clone either throws a duplicate key violationSQL error when you INSERT a row with a primary key that already exists. In EF Core: Microsoft.Data.SqlClient.SqlException: Violation of PRIMARY KEY constraint 'PK_Courses'. Cannot insert duplicate key in object 'dbo.Courses'. or — even worse — silently overwrites the original course.

The clone method copies every field from the original course, including the primary key Id. When you clone Course 42, the clone also has Id = 42. When you try to save this clone to the database, the database sees "you're inserting a new row with Id = 42, but Id = 42 already exists" and throws an error.

But the silent-overwrite scenario is scarier. If the code calls Update() or an upsert-style repository method instead of Add(), the ORM sees an entity with an existing ID and issues an UPDATE instead of an INSERT. The original Course 42 gets overwritten with the clone's data. No error. No warning. A student logs in and finds their course content has changed to the cloned version. Data loss discovered by accident.

The same problem extends to child entities. Each course has Modules, and each Module has its own Id. If you clone the course and its modules but keep the same module IDs, saving creates a mess of duplicate keys throughout the entire entity tree.

Think of it like making a photocopy of your passport. The copy has your same passport number, your same photo, your same birthdate. If someone tries to use both passports at the airport, the system flags a duplicate. A clone needs a NEW identity — new passport number, new issue date, everything that makes it uniquely "its own thing."

[Diagram: the original and cloned Course both have Id = 42; db.Add(clone) against a Courses table that already contains Id = 42 causes a PK_Courses violation or a silent overwrite]
DuplicateKeyBug.cs
public class Course
{
    public int Id { get; set; }              // ❌ cloned with the original's ID
    public string Title { get; set; } = "";
    public List<Module> Modules { get; set; } = new();
    public DateTime CreatedAt { get; set; }

    public Course Clone() => new()
    {
        Id = this.Id,                        // ❌ SAME ID as original!
        Title = $"{this.Title} (Copy)",
        Modules = this.Modules.Select(m => m.Clone()).ToList(),
        CreatedAt = this.CreatedAt           // ❌ same timestamp too
    };
}

var original = db.Courses.Find(42);          // Id = 42
var clone = original.Clone();                // Id = 42 still!
db.Courses.Add(clone);
await db.SaveChangesAsync();                 // ❌ DUPLICATE KEY or overwrites original

Walking through the buggy code: The Clone() method manually copies every field, including Id = this.Id. For a course with Id = 42, the clone also gets Id = 42. When EF Core sees db.Courses.Add(clone), it tries to INSERT a row with Id = 42 into a table that already has a row with Id = 42. The CreatedAt timestamp is also copied, making it look like the clone was created at the same time as the original — misleading for auditing.

IdentityResetFix.cs
public class Course : IPrototype<Course>
{
    public int Id { get; set; }
    public string Title { get; set; } = "";
    public List<Module> Modules { get; set; } = new();
    public DateTime CreatedAt { get; set; }

    public Course DeepClone() => new()
    {
        // ✅ Id = 0 → EF Core treats as new entity, auto-generates ID
        Title = $"{this.Title} (Copy)",
        Modules = this.Modules.Select(m => m.DeepClone()).ToList(),
        CreatedAt = DateTime.UtcNow          // ✅ fresh timestamp
    };
}

// Also reset child IDs
public class Module : IPrototype<Module>
{
    public int Id { get; set; }
    public string Name { get; set; } = "";

    public Module DeepClone() => new()
    {
        // ✅ Id = 0 → new entity
        Name = this.Name
    };
}

Why the fix works: The fix simply omits the Id property from the object initializer. In C#, an unset int defaults to 0. EF Core treats an entity with Id = 0 as a brand-new entity and lets the database auto-generate a fresh ID. The CreatedAt is set to DateTime.UtcNow so the clone has its own creation timestamp. Crucially, the child Module entities also reset their IDs — every level of the entity tree needs a fresh identity.

How to Spot This in Your Code

Search for any DeepClone() or Clone() method that copies an Id field. Also check for: Guid primary keys (need Guid.NewGuid()), CreatedAt/UpdatedAt timestamps, RowVersion concurrency tokens, and CreatedBy audit fields. All of these represent identity or metadata about the specific entity, not transferable data. A good practice is to create a unit test that clones an entity and asserts clone.Id == 0.
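That guard test might look like this xUnit sketch (Course and Module mirror the fixed types above, minus the IPrototype interface):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Xunit;

public class Course
{
    public int Id { get; set; }
    public string Title { get; set; } = "";
    public List<Module> Modules { get; set; } = new();
    public DateTime CreatedAt { get; set; }

    public Course DeepClone() => new()
    {
        // Id deliberately omitted: defaults to 0, so the database assigns a new key
        Title = $"{Title} (Copy)",
        Modules = Modules.Select(m => m.DeepClone()).ToList(),
        CreatedAt = DateTime.UtcNow
    };
}

public class Module
{
    public int Id { get; set; }
    public string Name { get; set; } = "";
    public Module DeepClone() => new() { Name = Name };  // Id omitted here too
}

public class CloneIdentityTests
{
    [Fact]
    public void DeepClone_Resets_Every_Identity_Field()
    {
        var original = new Course { Id = 42, Title = "C# Basics" };
        original.Modules.Add(new Module { Id = 7, Name = "Intro" });

        var clone = original.DeepClone();

        Assert.Equal(0, clone.Id);                        // new entity for EF Core
        Assert.All(clone.Modules, m => Assert.Equal(0, m.Id));
        Assert.NotEqual(default, clone.CreatedAt);        // fresh timestamp, not default
    }
}
```

A test like this runs in milliseconds and catches the bug the moment someone adds a new identity field and forgets to exclude it from the clone.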

Lesson Learned

ALWAYS reset identity fields (IDs, GUIDs, timestamps, audit trails) when cloning entities that will be persisted. A clone is a NEW entity — it must have a NEW identity. Make identity reset part of the DeepClone() contract, not an afterthought.

The Incident

Saturday 1 AM. A multiplayer game server starts getting reports of bizarre enemy behavior during peak hours. Goblins spawn with the health of an orc. Orcs appear with zero attack power. Some enemies have stats from completely different enemy types, mixed and matched in impossible combinations. The game is chaos.

The game server has a prototype registry — a dictionary that stores one "template" enemy of each type (goblin, orc, dragon, etc.). When the server needs to spawn a new enemy, it grabs the prototype and calls DeepClone(). This works perfectly with a few players. But during Saturday night peak load, 500+ enemies spawn per second across dozens of threads.

The root problem is twofold. First, a regular Dictionary<string, GameEntity> is not thread-safe. When one thread reads from the dictionary while another thread writes to it (updating a prototype), the internal data structure can get corrupted. The reader might get a partially-updated reference — reading half of the old data and half of the new data, creating a Frankenstein entity.

Second, if the DeepClone() method itself has any temporary state (like swapping internal buffers or using a shared temp variable), multiple threads calling it simultaneously on the same prototype can interfere with each other. Thread A starts cloning the goblin, Thread B starts cloning the same goblin half a nanosecond later, and they step on each other's intermediate state.

Think of it like two people trying to photocopy the same document at the same time on the same copier. If person A is mid-copy and person B swaps the document on the glass, person A's copy ends up with half of one document and half of another. The solution is either to use separate copiers (independent cloning) or to ensure only one person uses the copier at a time (thread safety).

It took 6 hours to diagnose because the corruption only happened under peak load — a thread-sanitizerA tool that detects data races in concurrent code. .NET doesn't have a built-in thread sanitizer like C++ TSan, but you can use stress tests with Environment.ProcessorCount * 4 threads, or tools like JetBrains dotTrace to detect contention. stress test running dozens of threads finally reproduced the corruption consistently.

[Diagram: threads A and C read while thread B writes to an unsafe Dictionary of prototypes ("goblin", "orc"); the race condition yields corrupted clones with mixed goblin/orc stats, zero-health enemies, and negative attack values]
RaceConditionBug.cs
public class EnemySpawner
{
    private readonly Dictionary<string, GameEntity> _prototypes = new();

    public EnemySpawner()
    {
        _prototypes["goblin"] = LoadEnemyConfig("goblin");
        _prototypes["orc"] = LoadEnemyConfig("orc");
    }

    public GameEntity Spawn(string type)
    {
        var proto = _prototypes[type];
        // ❌ Multiple threads read the prototype simultaneously
        // If Clone() temporarily mutates the prototype (e.g., swapping internal buffers),
        // or if another thread is modifying the prototype registry...
        return proto.DeepClone();
    }

    // ❌ Another thread could call this while Spawn is reading
    public void UpdatePrototype(string type, GameEntity updated)
    {
        _prototypes[type] = updated;  // ❌ Dictionary is not thread-safe
    }
}

Walking through the buggy code: The EnemySpawner uses a regular Dictionary to store prototypes. Multiple game threads call Spawn() simultaneously, each reading from the dictionary and calling DeepClone() on the same prototype. Meanwhile, UpdatePrototype() can be called by a configuration reload thread, writing to the dictionary while others are reading. Dictionary is not designed for concurrent access — reading while writing can cause internal array corruption, null references, or returning the wrong value entirely.

ThreadSafeFix.cs
public class EnemySpawner
{
    // ✅ ConcurrentDictionary for thread-safe reads/writes
    private readonly ConcurrentDictionary<string, GameEntity> _prototypes = new();

    public EnemySpawner()
    {
        _prototypes["goblin"] = LoadEnemyConfig("goblin");
        _prototypes["orc"] = LoadEnemyConfig("orc");
    }

    public GameEntity Spawn(string type)
    {
        if (!_prototypes.TryGetValue(type, out var proto))
            throw new ArgumentException($"Unknown: {type}");

        // ✅ DeepClone creates a fully independent copy
        // The prototype itself is never mutated during cloning
        return proto.DeepClone();
    }

    // ✅ Atomic replacement — readers see old or new, never partial
    public void UpdatePrototype(string type, GameEntity updated)
    {
        _prototypes.AddOrUpdate(type, updated, (_, _) => updated);
    }
}

Why the fix works: ConcurrentDictionary is specifically designed for multi-threaded access. Its TryGetValue() method is lock-free for reads, meaning hundreds of threads can read simultaneously without blocking each other. Writes use fine-grained (striped) locking that only blocks other writes to the same bucket — readers are never blocked. The AddOrUpdate() call atomically replaces the prototype, so readers always see either the old value or the new value, never a half-written one. And because the DeepClone() method only reads from the prototype (never mutates it), multiple threads can safely clone the same prototype at the same time.

How to Spot This in Your Code

If your prototype registry is a plain Dictionary or List and it's accessed from multiple threads (any Singleton service in ASP.NET Core, any shared registry in a game server, any static cache), it's a ticking time bomb. Search for Dictionary< in Singleton-registered services. Also check whether your DeepClone() method mutates ANY state on the prototype — even temporary state like a buffer swap is dangerous under concurrency.

Lesson Learned

Prototype registries in multi-threaded environmentsAny scenario where multiple threads access the same data concurrently. In ASP.NET Core, every HTTP request runs on a different thread from the ThreadPool. A Singleton registry is accessed by all requests simultaneously. Without thread-safe collections, concurrent reads during writes cause corrupted data, lost updates, or exceptions. must use ConcurrentDictionary or immutable snapshotsReplace the entire collection atomically instead of modifying in place. Using Volatile.Read/Write or ImmutableDictionary, readers always see a consistent version — either the old or the new, never a half-updated state. This is the "copy-on-write" approach to thread safety.. The DeepClone() method must NEVER mutate the prototype — it should only READ fields and create new objects. If your clone method has any temp state, it's not thread-safe.
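The immutable-snapshot alternative mentioned above might look like this sketch (GameEntity is a minimal stand-in with a read-only DeepClone()). Readers always see a complete dictionary; writers atomically swap in a new one:

```csharp
using System;
using System.Collections.Immutable;

public class GameEntity
{
    public string Type { get; set; } = "";
    public int Health { get; set; }
    // Only reads fields, never mutates the prototype: safe under concurrency
    public GameEntity DeepClone() => new() { Type = Type, Health = Health };
}

public class SnapshotSpawner
{
    // The field holds a snapshot; it is replaced, never modified in place
    private ImmutableDictionary<string, GameEntity> _prototypes =
        ImmutableDictionary<string, GameEntity>.Empty;

    public GameEntity Spawn(string type) =>
        _prototypes.TryGetValue(type, out var proto)
            ? proto.DeepClone()
            : throw new ArgumentException($"Unknown: {type}");

    public void UpdatePrototype(string type, GameEntity updated) =>
        // Compare-and-swap loop: readers see either the old or the new snapshot
        ImmutableInterlocked.Update(ref _prototypes, d => d.SetItem(type, updated));
}
```

Compared to ConcurrentDictionary, this copy-on-write approach makes writes more expensive but guarantees readers a fully consistent view of the whole registry, not just one key.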

Section 13

Pitfalls & Anti-Patterns

Mistake: Using MemberwiseClone() without checking if all fields are safe to shallow-copy.

Why This Happens: Developers reach for MemberwiseClone() because it's fast, built-in, and requires zero effort — one line of code copies everything. It feels like the "right" way to clone because it's literally provided by the framework. The problem is that MemberwiseClone() copies fields at face value: for a List<OrderLine>, it copies the reference (the pointer to the list), not the list itself. So the original and clone share the same list. It's like photocopying a sticky note that says "see folder on shelf 3" — both copies point to the same physical folder.

This trap is especially sneaky because it works perfectly for simple objects (only value types and strings). Developers test with a simple case, see it works, and assume it works for everything. The bug only surfaces when someone mutates a collection or nested object on the clone and the original silently changes too.

[Diagram: shallow MemberwiseClone leaves original and clone sharing one List<OrderLine> (mutate one, both change); a deep clone gives each an independent list, fully isolated]
ShallowTrap.cs
// ❌ DO NOT DO THIS — any reference-type field becomes shared
public Order Clone() => (Order)MemberwiseClone();
// Order.Lines (List<OrderLine>) is now shared between original and clone

The connection: The bad code trusts MemberwiseClone() blindly. The fix is to audit every field in your class: value types (int, decimal, DateTime) and immutable types (string) are safe. Everything else — List<T>, Dictionary<K,V>, arrays, custom classes — needs explicit deep copying with new List<T>(this.Lines) or .Select(x => x.DeepClone()).ToList().
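The field-by-field audit can be made concrete. A sketch with illustrative Order/OrderLine types (the property names here are assumptions, not taken from a real Order class):

```csharp
using System.Collections.Generic;
using System.Linq;

public class OrderLine
{
    public string Sku { get; set; } = "";
    public int Quantity { get; set; }
    public OrderLine DeepClone() => new() { Sku = Sku, Quantity = Quantity };
}

public class Order
{
    public int Number { get; set; }                      // value type: copied by value, safe
    public string Customer { get; set; } = "";           // string: immutable, safe to share
    public List<OrderLine> Lines { get; set; } = new();  // mutable reference: must deep-copy

    public Order DeepClone() => new()
    {
        Number = Number,
        Customer = Customer,
        // New list AND new items: two levels of copying, not just one
        Lines = Lines.Select(l => l.DeepClone()).ToList()
    };
}
```

After this, mutating clone.Lines (adding, removing, or editing an OrderLine) leaves the original untouched.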

Mistake: Implementing ICloneable because "it's in the framework."

Why This Happens: When you type "clone" into an IDE, IntelliSense helpfully suggests ICloneable. It's an official .NET interface, so it feels like the right thing to do. But ICloneable was designed in .NET 1.0, before generics existed. Its Clone() method returns object, which means every caller must cast the result — and if they cast to the wrong type, the error only shows up at runtime. Worse, the interface says nothing about whether the clone should be deep or shallow. Different .NET classes implement it differently, and callers have no way to know which they're getting.

Microsoft's own Framework Design Guidelines by Cwalina and Abrams explicitly state: "Do not implement ICloneable." The interface is considered a design mistake that's kept around only for backward compatibility.

[Diagram: ICloneable.Clone() returns object, so the caller needs an unsafe (Settings)obj.Clone() cast with unknown deep/shallow semantics; IPrototype<T>.DeepClone() returns Settings directly, type-safe with an explicit deep contract]
ICloneableBroken.cs
// ❌ ICloneable — returns object, no deep/shallow guidance
public class Settings : ICloneable
{
    public object Clone() => MemberwiseClone(); // shallow? deep? caller can't tell
}
var copy = (Settings)original.Clone(); // unsafe cast required
TypedPrototype.cs
// ✅ Generic interface — type-safe, explicit about deep copy
public interface IPrototype<out T> where T : class { T DeepClone(); }

public class Settings : IPrototype<Settings>
{
    public Settings DeepClone() => new() { /* ... */ }; // returns Settings, not object
}
Settings copy = original.DeepClone(); // no cast needed!

The connection: The bad code uses ICloneable and forces callers to cast from object — a runtime gamble. The good code uses a generic IPrototype<T> where the return type is baked into the interface. The compiler catches type mismatches at build time, the method name DeepClone() makes the contract explicit, and no casting is needed.

Mistake: Deep-cloning the top-level object but assigning collection references directly.

Why This Happens: When writing a manual DeepClone() method, developers carefully copy simple properties like Name and Price. When they get to a collection property like Tags, they write Tags = this.Tags without thinking twice — it looks the same as copying a string. But strings are immutable (can't be changed after creation), so sharing a string reference is perfectly safe. A List<string>, however, IS mutable — you can add, remove, and modify items. Sharing a list reference means both objects see each other's changes.

There's also a subtle two-level depth issue: even if you create a new list with new List<T>(this.Items), if T is a reference type (like a custom OrderLine class), the list is new but the items inside are still shared. You need to clone each item too.

[Diagram: Tags = this.Tags shares one List<string>, so clone.Tags.Add() changes the original; new List<string>(this.Tags) produces independent lists that are safe to mutate]
ForgotCollection.cs
// ❌ Name is deep-copied (string is immutable), but Tags is shared
public Product DeepClone() => new()
{
    Name = this.Name,
    Tags = this.Tags          // ❌ same List reference!
};
ClonedCollection.cs
// ✅ New list with same elements (safe for List<string> since strings are immutable)
Tags = new List<string>(this.Tags),

// ✅ For List<T> where T is mutable — clone each element too
Items = this.Items.Select(i => i.DeepClone()).ToList()

The connection: The bad code writes Tags = this.Tags which copies the pointer, not the data. The good code writes new List<string>(this.Tags) which creates a brand-new list populated with the same strings. For lists of complex objects, each item also needs its own DeepClone() call — it's turtles all the way down.

Mistake: Cloning an object that holds streamsSystem.IO.Stream and its subclasses (FileStream, MemoryStream, NetworkStream) represent I/O resources. They maintain internal position counters, buffers, and OS handles. Cloning a stream via MemberwiseClone creates two objects sharing one OS handle — disposing one closes the handle for both., database connections, or HTTP clients.

Why This Happens: Some objects hold "resources" — things like file handles, network connections, or database connections that are limited and managed by the operating system. These aren't just data in memory that can be copied. They're like keys to a locked room: copying the key doesn't create a second room. Both keys open the same door. If one person returns the key (disposes the resource), the other person's key is now useless — the room is locked and nobody can get back in.

Developers fall into this trap when a class mixes data and resources. The data parts (file name, settings, configuration) are perfectly safe to clone. But the resource parts (open FileStream, active SqlConnection) cannot be meaningfully duplicated via memory copy.

[Diagram: a clone made via MemberwiseClone shares the original's FileStream, so disposing the original makes the clone throw ObjectDisposedException; cloning only the config and opening a fresh stream gives each instance its own resource]
DisposableTrap.cs
// ❌ FileProcessor holds a Stream — cloning shares the stream
public class FileProcessor : IDisposable
{
    private readonly FileStream _stream;
    public FileProcessor Clone() => (FileProcessor)MemberwiseClone();
    // Both original and clone now reference the SAME stream
    // Disposing one closes the stream for both → ObjectDisposedException
}
SeparateDataFromResources.cs
// ✅ Separate clonable data from non-clonable resources
public record FileProcessorConfig(string FilePath, Encoding Encoding);  // ← clone this

public class FileProcessor : IDisposable
{
    private readonly FileStream _stream;  // ← NOT clonable
    public FileProcessorConfig Config { get; }

    public FileProcessor(FileProcessorConfig config)
    {
        Config = config;
        _stream = File.OpenRead(config.FilePath);  // each instance opens its own stream
    }
}

The connection: The bad code clones the whole object, sharing the stream. The good code separates data (config) from resources (stream). Clone the config, then create a new FileProcessor from the cloned config — each instance opens its own stream. If a class implements IDisposable, that's a strong signal it shouldn't be cloned directly.

Mistake: Assuming cloning is faster than new without benchmarking.

Why This Happens: The Prototype pattern is often taught as a "performance optimization" — skip expensive construction by cloning a pre-built object. This is true when construction involves database queries, file I/O, or network calls (things that take milliseconds). But developers sometimes apply this logic to simple data objects where construction is just setting a few properties (nanoseconds). In that case, new Product { Name = "Widget", Price = 9.99m } is already blazingly fast — there's nothing to optimize.

MemberwiseClone() itself is very fast (~2-5 nanoseconds). But a manual deep clone that creates new lists, dictionaries, and nested objects allocates a lot of memory and can actually be slower than constructing from scratch. Serialization-based cloning (JSON round-trip) is dramatically slower — 100-1000x. Using Prototype for performance without measuring is cargo-cult programming.

Performance reality check:
new Product { ... }     ~5 ns (fast)
MemberwiseClone()       ~3 ns (fastest)
Manual DeepClone()      ~50-500 ns (allocations)
JSON round-trip         ~5000 ns
BenchmarkFirst.cs
// ❌ Premature optimization — cloning a simple POCO
var clone = template.DeepClone();  // creates new lists, dicts, nested objects

// ✅ For simple objects, just use new — it's already fast
var product = new Product { Name = "Widget", Price = 9.99m };

// ✅ Only use Prototype when construction is genuinely expensive:
// - Loading from database (ms), parsing config files (ms), network calls (ms)
// - Complex computed state that takes significant CPU time

The connection: Prototype shines when the setup is expensive, not when the object is complex. If you can construct the object from scratch in microseconds, just use new. Reserve Prototype for objects whose initial state comes from slow external sources. And always measure with BenchmarkDotNetThe standard .NET micro-benchmarking library. It handles warmup, statistical analysis, GC pressure measurement, and memory allocation tracking. Always use it for performance claims — never trust a Stopwatch in a loop. before claiming a performance benefit.
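A minimal BenchmarkDotNet sketch for such a measurement (the Product type is illustrative; absolute numbers vary by machine):

```csharp
using BenchmarkDotNet.Attributes;

public class Product
{
    public string Name { get; set; } = "";
    public decimal Price { get; set; }
    public Product DeepClone() => new() { Name = Name, Price = Price };
}

[MemoryDiagnoser]  // also reports allocations per operation
public class CloneBenchmarks
{
    private readonly Product _template = new() { Name = "Widget", Price = 9.99m };

    [Benchmark(Baseline = true)]
    public Product ViaNew() => new() { Name = "Widget", Price = 9.99m };

    [Benchmark]
    public Product ViaDeepClone() => _template.DeepClone();
}

// In Program.cs: BenchmarkDotNet.Running.BenchmarkRunner.Run<CloneBenchmarks>();
```

If the benchmark shows the two within noise of each other (likely for a simple POCO like this), the Prototype pattern buys you nothing here and plain new is the clearer choice.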

Mistake: Having Clone() return a base class or interface type instead of the concrete type.

Why This Happens: Developers define Clone() on the base class (returning Shape), thinking it's more "polymorphic." But this forces every caller to cast the result: var circle = (Circle)shape.Clone(). Casts are dangerous — if someone passes an Ellipse where a Circle was expected, the cast throws at runtime. The compiler can't help you because the return type is just Shape. C# 9 introduced covariant return types specifically to solve this — an overriding method can return a more specific type than the base method declares.

[Diagram: Clone() returning Shape forces a runtime (Circle) cast that can throw; DeepClone() returning Circle needs no cast and is compile-time safe]
WrongReturnType.cs
// ❌ Returns Shape, not Circle — caller must cast
public class Circle : Shape
{
    public double Radius { get; set; }
    public Shape Clone() => new Circle { X = X, Y = Y, Radius = Radius };
}
var copy = (Circle)shape.Clone();  // ❌ runtime cast — could throw!
TypedReturn.cs
// ✅ Returns Circle — no cast needed, type-safe at compile time
public class Circle : Shape, IPrototype<Circle>
{
    public double Radius { get; set; }
    public Circle DeepClone() => new() { X = X, Y = Y, Radius = Radius };
}
Circle copy = original.DeepClone();  // ✅ compiler knows it's a Circle

The connection: The bad code returns Shape, forcing callers to guess (and cast) the actual type. The good code uses IPrototype<Circle> so DeepClone() returns Circle directly. No cast, no guessing, no runtime surprises. The type system does the work for you.

Mistake: Storing mutable prototypes in a registry without protection — someone modifies the template directly instead of cloning first.

Why This Happens: A prototype registry stores template objects that serve as the "master copy" for cloning. The natural instinct is to write a GetPrototype() method that returns the template directly. But if the registry hands out the original object (not a clone), any caller can accidentally modify the template. Imagine a library lending out its only copy of a book — if someone scribbles in the margins, every future borrower sees the scribbles. The library should lend out photocopies, never the original.

This bug is particularly insidious because the corruption is delayed. Someone sets goblin.Health = 999 during testing, and the next 500 goblins all spawn with 999 health. The person who mutated the prototype is long gone from that code path, making the bug extremely hard to trace back to its source.

[Diagram: GetPrototype() exposes the goblin template, so a caller's goblin.Health = 999 corrupts it and every future clone gets 999 HP; Create() returns an independent clone and the template stays untouched forever]
MutableRegistry.cs
// ❌ Registry exposes prototypes directly
public GameEntity GetPrototype(string type) => _prototypes[type];

// Caller accidentally mutates the prototype!
var goblin = registry.GetPrototype("goblin");
goblin.Stats.Health = 999;  // ❌ Now EVERY future clone has 999 health
SafeRegistry.cs
// ✅ Registry only exposes clones — prototype is never directly accessible
public GameEntity Create(string type) => _prototypes[type].DeepClone();

// Caller gets an independent copy — mutating it is perfectly safe
var goblin = registry.Create("goblin");
goblin.Stats.Health = 999;  // ✅ Only THIS goblin has 999 health

The connection: The bad code exposes the template directly via GetPrototype(). The good code only exposes a Create() method that returns a clone. The template is an internal implementation detail that no caller can touch. Even a careless developer can't accidentally corrupt the master copy because they never have direct access to it.

Mistake: Assuming original.Equals(clone) works after cloning.

Why This Happens: After cloning an object, you'd expect the original and clone to be "equal" since they have identical data. But in C#, the default Equals() on classes uses reference equality — it checks whether two variables point to the exact same object in memory, not whether they have the same values. Since a clone is a different object (at a different memory address), original.Equals(clone) is always false, even though every single property matches.

This becomes a real problem when you use collections that depend on equality: HashSet<T>.Contains(clone) returns false, list.Distinct() treats clones as different items, and dictionary lookups by object key fail. You think the clone is "the same thing," but C# disagrees.

[Diagram: two class instances with identical data but different addresses (0x01 vs 0x02) give Equals() = false under reference equality; two records with identical data give Equals() = true under value equality]
EqualityTrap.cs
// ❌ Class with default (reference) equality
var clone = original.DeepClone();
original.Equals(clone)   // false! Different objects in memory
hashSet.Contains(clone)  // false! HashSet uses GetHashCode() + Equals()

// ✅ Use records — they get value equality for free
public record Product(string Name, decimal Price);
var clone = original with { };
original.Equals(clone)   // true! Records compare by property values

The connection: If your cloned objects need to participate in equality checks (collections, deduplication, comparisons), either override Equals() and GetHashCode() to compare by value semanticsTwo objects are "equal" if their property values match, regardless of whether they're the same instance in memory. Records get this for free (compiler-generated). For classes, you must override Equals() to compare properties, and GetHashCode() to return consistent hash codes for equal objects., or use C# records which generate value-based equality automatically.
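For classes that can't be converted to records, the override looks roughly like this — a minimal sketch with a hypothetical Product type:

```csharp
// A hypothetical class with value-based equality, so clones compare equal.
public class Product : IEquatable<Product>
{
    public string Name { get; init; } = "";
    public decimal Price { get; init; }

    public bool Equals(Product? other) =>
        other is not null && Name == other.Name && Price == other.Price;

    public override bool Equals(object? obj) => Equals(obj as Product);

    // Equal objects MUST return the same hash code, or HashSet/Dictionary break.
    public override int GetHashCode() => HashCode.Combine(Name, Price);
}
```

With this in place, a deep clone with identical values satisfies `original.Equals(clone)` and `hashSet.Contains(clone)` — the same behavior records give you for free.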

Mistake: Using JSON serialization for cloning when some properties are marked [JsonIgnore].

Why This Happens: Serialization-based cloning (serialize to JSON, then deserialize back) is incredibly convenient — it handles nested objects, collections, and complex graphs with zero manual code. Developers reach for it as a "universal deep clone" solution. The trap is that serializers only see properties that are configured for serialization. Properties marked [JsonIgnore] (hidden from API responses for security reasons, like password hashes or internal tokens) are also hidden from the clone. The serializer doesn't know you're cloning — it just does its job of ignoring what you told it to ignore.

Private fields, calculated properties, and properties with custom converters are also at risk. If the serializer can't see it, it can't clone it. The data is silently lost — no error, no warning.

[Diagram: JSON clone drops [JsonIgnore] fields — Name = "Alice" survives the serialize/deserialize round trip, but PwdHash and Avatar come back empty. Data silently lost — no error, no warning.]
JsonIgnoreTrap.cs
public class UserProfile
{
    public string Name { get; set; } = "";
    [JsonIgnore] public string PasswordHash { get; set; } = "";  // ← ignored!
    [JsonIgnore] public byte[] Avatar { get; set; } = Array.Empty<byte>(); // ← ignored!
}

// JSON clone silently drops ignored properties!
var clone = JsonSerializer.Deserialize<UserProfile>(
    JsonSerializer.Serialize(original));
// clone.PasswordHash is "" — data lost!
ManualClonePreferred.cs
// ✅ Manual clone copies ALL fields regardless of serialization attributes
public UserProfile DeepClone() => new()
{
    Name = this.Name,
    PasswordHash = this.PasswordHash,   // ✅ copied explicitly
    Avatar = this.Avatar.ToArray()       // ✅ new array, all data preserved
};

The connection: Serialization attributes exist to control API responses, not cloning. If you use serialization for cloning, those API-focused attributes leak into your cloning logic, silently dropping fields. Use manual DeepClone() for types with serialization attributes, or create separate JsonSerializerOptions that include all properties (but this gets messy fast). Manual cloning is clearer and safer.

Mistake: Cloning a service that has DI-injected dependencies (logger, repository, HttpClient).

Why This Happens: In ASP.NET Core, the DI containerMicrosoft.Extensions.DependencyInjection — the built-in dependency injection container in ASP.NET Core. It manages object creation, lifetime tracking (Transient = new per request, Scoped = one per HTTP request, Singleton = one for the app), and disposal. Cloned objects bypass this management entirely. manages object lifetimes carefully. A ScopedOne instance per HTTP request / scope. Created when the scope starts, disposed when it ends. Ideal for DbContext, per-request state. service like DbContext is created at the start of an HTTP request and disposed at the end. A TransientA new instance every time the service is requested from the container. No reuse, no sharing. service gets a fresh instance each time. When you clone a service, the DI container has no idea the clone exists. It doesn't track the clone's lifetime, doesn't dispose it, and the clone might hold onto Scoped dependencies that have already been disposed.

Think of it like this: the DI container is a hotel front desk that tracks which rooms are occupied. Cloning a guest (service) without telling the front desk means the hotel doesn't know there's a second person in the building. When checkout time comes, they only evict the registered guest — the clone stays behind, using resources nobody is tracking.

[Diagram: cloning a service creates a "ghost" instance the DI container doesn't track and won't dispose. The alternative: clone data objects (OrderData) and resolve services (OrderProcessor) from DI.]
DiCloneTrap.cs
// ❌ Cloning a service with injected dependencies
public class OrderProcessor
{
    private readonly ILogger _logger;           // injected by DI
    private readonly IOrderRepository _repo;    // injected by DI

    public OrderProcessor Clone() => (OrderProcessor)MemberwiseClone();
    // The clone shares the SAME logger and repo instances
    // DI container doesn't know about this clone — lifetime tracking is broken
}
CloneDataNotServices.cs
// ✅ Only clone DATA objects — never services
public record OrderData(string Customer, List<LineItem> Items);  // ← clone this

public class OrderProcessor  // ← resolve from DI, never clone
{
    private readonly ILogger _logger;
    private readonly IOrderRepository _repo;

    public void Process(OrderData data) { /* ... */ }
}

The connection: Prototype is for data objects — DTOs, entities, value objects, configuration. Services with injected dependencies should NEVER be cloned. If you need multiple instances of a service, register it as Transient in the DI container and let the container create each instance properly. The rule of thumb: if a class receives dependencies through its constructor, it's a service and shouldn't be cloned.

Section 14

Testing Strategies

Strategy 1: Verify Deep Independence (Mutate Clone, Assert Original Unchanged)

The most critical test: prove that modifying a clone does NOT affect the original. Mutate every mutable field on the clone and assert the original's values are unchanged.

DeepIndependenceTest.cs
[Fact]
public void DeepClone_ModifyingClone_DoesNotAffectOriginal()
{
    // Arrange
    var original = new Document
    {
        Title = "Original",
        Sections = { new Section { Heading = "Intro", Content = "Hello" } },
        Metadata = { ["key"] = "value" }
    };

    // Act
    var clone = original.DeepClone();
    clone.Title = "Modified";
    clone.Sections[0].Heading = "Changed";
    clone.Sections.Add(new Section { Heading = "New" });
    clone.Metadata["key"] = "different";
    clone.Metadata["extra"] = "added";

    // Assert — original is completely untouched
    Assert.Equal("Original", original.Title);
    Assert.Single(original.Sections);
    Assert.Equal("Intro", original.Sections[0].Heading);
    Assert.Equal("value", original.Metadata["key"]);
    Assert.False(original.Metadata.ContainsKey("extra"));
}

[Fact]
public void DeepClone_ListElements_AreIndependent()
{
    var original = new Document
    {
        Sections = { new Section { Heading = "A" }, new Section { Heading = "B" } }
    };

    var clone = original.DeepClone();

    // Verify each section is a different object instance
    Assert.NotSame(original.Sections[0], clone.Sections[0]);
    Assert.NotSame(original.Sections[1], clone.Sections[1]);
}

If your object graph has circular references, verify that cloning terminates and the circular structure is preserved in the clone.

CircularRefTest.cs
[Fact]
public void DeepClone_CircularReference_DoesNotStackOverflow()
{
    // Arrange: A → B → A (circular)
    var ceo = new Employee { Name = "CEO" };
    var vp = new Employee { Name = "VP", Manager = ceo };
    ceo.DirectReports.Add(vp);

    // Act — should NOT throw StackOverflowException
    var clonedCeo = ceo.DeepClone();

    // Assert
    Assert.Equal("CEO", clonedCeo.Name);
    Assert.Single(clonedCeo.DirectReports);
    Assert.Equal("VP", clonedCeo.DirectReports[0].Name);

    // The circular reference is preserved in the clone
    Assert.Same(clonedCeo, clonedCeo.DirectReports[0].Manager);

    // But the clone graph is independent from the original
    Assert.NotSame(ceo, clonedCeo);
    Assert.NotSame(vp, clonedCeo.DirectReports[0]);
}

If you chose Prototype for performance reasons, prove it. Use BenchmarkDotNet to compare clone time vs constructor time.

CloneBenchmark.cs
[MemoryDiagnoser]
public class CloneBenchmarks
{
    private readonly Document _prototype;

    public CloneBenchmarks()
    {
        _prototype = BuildExpensiveDocument(); // one-time expensive setup, paid once
    }

    [Benchmark(Baseline = true)]
    public Document ConstructFromScratch() => BuildExpensiveDocument();

    [Benchmark]
    public Document DeepCloneManual() => _prototype.DeepClone();

    [Benchmark]
    public Document DeepCloneJson() => _prototype.JsonDeepClone();

    [Benchmark]
    public Document ShallowClone() => _prototype.ShallowClone();

    private static Document BuildExpensiveDocument()
    {
        Thread.Sleep(1); // simulate I/O
        return new Document
        {
            Title = "Report",
            Sections = Enumerable.Range(0, 50)
                .Select(i => new Section { Heading = $"Section {i}" })
                .ToList()
        };
    }
}

// Expected results:
// | Method              | Mean        | Allocated |
// |---------------------|-------------|-----------|
// | ConstructFromScratch| ~1,200 μs   | ~8 KB     |  ← expensive setup
// | DeepCloneManual     | ~2 μs       | ~6 KB     |  ← 600x faster
// | DeepCloneJson       | ~45 μs      | ~24 KB    |  ← 25x faster, more allocs
// | ShallowClone        | ~5 ns       | ~0.2 KB   |  ← fastest but UNSAFE

Verify that the clone has the same property values as the original (except identity fields). Use FluentAssertionsA popular .NET assertion library that provides readable, fluent syntax: clone.Should().BeEquivalentTo(original, opts => opts.Excluding(x => x.Id)). The BeEquivalentTo method does deep structural comparison — much better than checking each property manually. for deep structural comparisonComparing two objects by their property values rather than by reference identity. Assert.Equal checks if two objects are the "same" by value. Assert.Same checks if they're the same instance in memory. For clone testing, you want structural equality (same values) but NOT referential identity (different instances)..

EqualityTest.cs
[Fact]
public void DeepClone_PreservesAllProperties_ExceptIdentity()
{
    var original = new Report
    {
        Department = "Engineering",
        Sections = { "Intro", "Metrics", "Summary" },
        Figures = { ["headcount"] = 42, ["revenue"] = 1_000_000m },
        Compliance = new ComplianceData { GdprChecked = true, Sox404Checked = true }
    };

    var clone = original.DeepClone();

    // Identity fields should be NEW
    Assert.NotEqual(original.Id, clone.Id);
    Assert.True(clone.Created >= original.Created);

    // Content fields should match
    clone.Should().BeEquivalentTo(original, opts => opts
        .Excluding(x => x.Id)
        .Excluding(x => x.Created));
}

[Theory]
[InlineData("goblin")]
[InlineData("orc")]
public void PrototypeRegistry_Spawn_ReturnsCorrectType(string enemyType)
{
    var entity = EnemyRegistry.Spawn(enemyType);

    Assert.NotNull(entity);
    Assert.NotEqual(Guid.Empty, entity.InstanceId);
    Assert.Equal(enemyType, entity.Name.ToLowerInvariant());
}
Section 15

Performance Considerations

Prototype's performance story varies dramatically depending on the cloning strategy. Here's what you need to know:

| Method | Approx. Time | Allocations | Use Case |
|--------|--------------|-------------|----------|
| MemberwiseClone() | ~2-10 ns | 1 object (same size as original) | Shallow copy of value-type-only objects |
| Manual deep clone | ~50-500 ns | 1 object + new collections | Best balance of speed and correctness |
| Record with | ~5-20 ns | 1 object (shallow) | Immutable records with value properties |
| MessagePack clone | ~500 ns-5 μs | Byte buffer + new object | Complex graphs, need binary speed |
| System.Text.Json clone | ~5-50 μs | String buffer + new object | Convenience over performance |
| Newtonsoft.Json clone | ~10-100 μs | String + reflection overhead | Legacy codebases |
| Expression tree clone | ~100-500 ns | 1 object + compiled delegates | Library-based (DeepCloner NuGet) |

Expression tree: System.Linq.Expressions allows building code as data, then compiling it into a delegate at runtime. Libraries like DeepCloner analyze types via reflection once, build an expression tree for field-by-field copying, compile it, and cache the delegate. Near-native speed after initial compilation.
PerformanceBenchmark.cs
[MemoryDiagnoser]
[SimpleJob(RuntimeMoniker.Net80)]
public class CloneStrategyBenchmarks
{
    private GameEntity _prototype = null!;
    private JsonSerializerOptions _jsonOptions = null!;

    [GlobalSetup]
    public void Setup()
    {
        _prototype = new GameEntity
        {
            Name = "Goblin",
            Stats = new Stats { Health = 30, Attack = 8, Defense = 3, Speed = 1.2 },
            Ai = new AiConfig { Behavior = "patrol", AggroRange = 5, Abilities = { "slash", "dodge" } }
        };
        _jsonOptions = new JsonSerializerOptions
        {
            ReferenceHandler = ReferenceHandler.Preserve
        };
    }

    [Benchmark(Baseline = true)]
    public GameEntity ManualDeepClone() => _prototype.DeepClone();

    [Benchmark]
    public GameEntity JsonDeepClone()
    {
        var json = JsonSerializer.Serialize(_prototype, _jsonOptions);
        return JsonSerializer.Deserialize<GameEntity>(json, _jsonOptions)!;
    }

    [Benchmark]
    public GameEntity MemberwiseShallow() => _prototype.ShallowClone();

    [Benchmark]
    public GameEntity ConstructNew() => new()
    {
        Name = "Goblin",
        Stats = new Stats { Health = 30, Attack = 8, Defense = 3, Speed = 1.2 },
        Ai = new AiConfig { Behavior = "patrol", AggroRange = 5, Abilities = { "slash", "dodge" } }
    };
}
When Cloning Beats Construction: Prototype wins when the original object required expensive I/O to build (database queries, file reads, API calls). If the object is just new Thing { A = 1, B = 2 }, construction is already fast — cloning adds complexity without benefit. The decision metric: Is the prototype's setup cost > 10x the clone cost? If yes, Prototype is worth it.
Memory Tip: MemberwiseClone() allocates exactly one object the same size as the original — it's a raw memory copy via the CLR. Manual deep clone allocates 1 + N objects (one for the root, N for nested collections/objects). JSON clone allocates the JSON string + the deserialized object graph. For memory-sensitive paths (game loops, tight hot paths), manual deep clone gives you the best control.
Section 16

How to Explain in Interview

Your Script (90 seconds)

Opening: "Prototype is the GoF pattern for creating objects by cloning an existing instance instead of constructing from scratch. Think of it as a photocopier for objects — you have a fully-configured original, press copy, and get an independent duplicate."

Core: "The pattern has three parts: a Prototype interface with a Clone() method, concrete prototypes that know how to copy themselves, and optionally a registry that maps keys to prototype instances. The critical distinction is deep vs shallow cloning — shallow copy shares nested references, which is a major bug source."

Example: "In a recent project, we had a report generator where base reports took 200ms to construct — loading templates, formatting rules, compliance data. Instead of rebuilding from scratch for each department, we cloned a pre-built template and only changed department-specific fields. This cut report generation from 200ms to under 1ms per report."

When: "I reach for Prototype when object construction is expensive and I need many variations, when I want to decouple the client from concrete types, or when object types are determined at runtime. I avoid it for simple objects where new is fine, or for immutable types where sharing is safe."

Close: "In modern .NET, C# records with 'with' expressions are the language-level Prototype for simple cases. For complex object graphs, I use manual deep clone with a typed IPrototype<T> interface — never ICloneable, which Microsoft themselves recommend against."

Section 17

Interview Q&As

Easy

Think First What problem does Prototype solve that new doesn't?

Think about how a bakery works. To make 50 identical gingerbread cookies, you don't sculpt each one from scratch. You make one perfect cookie cutter (the prototype), then stamp out copies. Each copy starts identical to the original shape, and you can decorate them differently afterward. That's Prototype in a nutshell — create one well-configured object, then copy it whenever you need a similar one.

Prototype is a creational design pattern that creates new objects by copying (cloning) an existing instance. Instead of calling new ConcreteClass() and configuring from scratch every time, the client calls prototype.Clone() to get a pre-configured copy. The client doesn't even need to know what concrete class it's dealing with — it just says "give me another one like this."

This matters in two situations: (1) when setting up the object is expensive (loading config from a database, parsing files), so you only want to do it once, and (2) when the exact type isn't known at compile time (runtime plugins, dynamic type loading), so you can't write new SpecificType() in your code.

Great Answer Bonus "Mention that C# records with 'with' expressions are essentially compiler-generated Prototype — the compiler creates a hidden Clone method and copy constructor automatically."
Think First What happens to a List<T> field in each type of copy?

Imagine you have a folder with a document and a sticky note inside. Shallow copy is like photocopying just the folder — you get a new folder, but the document and sticky note inside are the same physical items. Move the sticky note in one folder, and it's gone from the other. Deep copy is like photocopying the folder AND everything inside it — you get a completely independent set. Nothing you do to one folder affects the other.

In code terms: shallow copy duplicates the top-level object but copies reference-type fields by reference. The original and clone share the same nested objects (lists, dictionaries, custom classes). Changing a nested object in the clone also changes it in the original. Deep copy duplicates everything — the top-level object AND all nested objects recursively. Original and clone are completely independent.

In C#: MemberwiseClone() is always shallow. For deep copies, you need manual field copying (new List<string>(this.Tags)), serialization round-trips, or copy constructors. The important nuance: strings and value types (int, DateTime) are safe to shallow-copy because they're either immutable or copied by value automatically. The danger zone is mutable reference types — List<T>, Dictionary<K,V>, arrays, and custom classes.
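The difference in a minimal sketch — Order and its Tags list are hypothetical:

```csharp
public class Order
{
    public string Customer { get; set; } = "";
    public List<string> Tags { get; set; } = new();

    // Shallow: MemberwiseClone copies the Tags REFERENCE — both orders share one list.
    public Order ShallowClone() => (Order)MemberwiseClone();

    // Deep: allocate a new list so the clone's Tags are fully independent.
    public Order DeepClone() => new()
    {
        Customer = Customer,
        Tags = new List<string>(Tags)
    };
}
```

After `shallowClone.Tags.Add("urgent")`, the original's Tags also contains "urgent"; after `deepClone.Tags.Add("urgent")`, the original is unaffected.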

Great Answer Bonus "Note that strings and other immutable types are safe to shallow-copy because they can't be mutated — the 'danger zone' is mutable reference types like List, Dictionary, and custom classes."
Think First What information does the ICloneable.Clone() signature NOT tell you?

Imagine ordering food from a menu that just says "food" with no description. You don't know if you're getting a salad or a steak. That's ICloneable — it promises a Clone() method but tells you nothing useful about what you'll get back.

Two specific problems. First, Clone() returns object — callers must cast the result (e.g., (MyClass)thing.Clone()), and if they cast to the wrong type, the error only shows up at runtime, not at compile time. Second, the interface says nothing about whether the clone is deep or shallow. Different .NET classes implement it differently: Array.Clone() is shallow, but some classes do deep copies. A caller has no way to know without reading the source code.

This is why Microsoft's own Framework Design Guidelines explicitly state: "Do not implement ICloneable." The interface was created in .NET 1.0 before generics existed. If it were designed today, it would be ICloneable<T> with a T Clone() method. The modern alternative is a custom IPrototype<T> with a typed return and an explicit name like DeepClone() that leaves no ambiguity.
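A minimal sketch of that modern alternative, with a hypothetical Invoice type:

```csharp
// Typed return, and a method name that states the deep-copy guarantee explicitly.
public interface IPrototype<T>
{
    T DeepClone();
}

public class Invoice : IPrototype<Invoice>
{
    public string Number { get; set; } = "";
    public List<string> Lines { get; set; } = new();

    // No cast needed at the call site, and the name promises a deep copy.
    public Invoice DeepClone() => new()
    {
        Number = Number,
        Lines = new List<string>(Lines)
    };
}
```

The call site becomes `Invoice copy = invoice.DeepClone();` — no `(Invoice)` cast, no runtime surprise, no ambiguity about depth.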

Great Answer Bonus "Point out that ICloneable predates generics (.NET 1.0) — if it were designed today, it would be ICloneable<T> with a T Clone() method."
Think First Why is it so fast (~2-5 nanoseconds)?

Think of MemberwiseClone() as a high-speed photocopier for objects. It doesn't read each page one by one — it grabs the entire stack and copies it all in one shot. That's why it's so fast (2-5 nanoseconds for small objects).

Technically, MemberwiseClone() is a protected method on System.Object (so every object in C# has it). It's implemented as a native CLR intrinsic — it allocates a new chunk of memory the same size as the original object and does a raw byte-for-byte memory copy. No constructor runs, no field-by-field assignment, just one block copy operation (essentially memcpy at the CLR level).

The result: value-type fields (int, decimal, bool) get independent copies. Reference-type fields (List<T>, custom classes) get their pointers copied — they point to the same object in memory. That's what makes it "shallow." It also means no constructor side effects fire — if your constructor logs "object created" or increments a counter, the clone bypasses all of that. It exists without the constructor ever knowing.
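Both behaviors are easy to demonstrate — Monster and Stats here are hypothetical:

```csharp
public class Stats { public int Health; }

public class Monster
{
    public int Level;           // value type — copied by value, independent
    public Stats Stats = new(); // reference type — only the pointer is copied

    // Raw memory copy via the CLR; no constructor runs for the clone.
    public Monster ShallowClone() => (Monster)MemberwiseClone();
}
```

Mutating `clone.Level` leaves the original alone, but mutating `clone.Stats.Health` changes the original too, because both objects point at the same Stats instance.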

Great Answer Bonus "Mention that no constructor runs during MemberwiseClone — this can be surprising if your constructor has side effects (logging, validation, incrementing counters)."
Think First How would a client request a clone without knowing the concrete type?

Imagine a print shop with a filing cabinet full of document templates. You walk in and say "I need an invoice template." The clerk doesn't hand you the original template — they photocopy it and give you the copy. You fill in your details on the copy while the original stays pristine for the next customer. That filing cabinet is a Prototype Registry.

In code, a Prototype Registry is a centralized lookup (typically a Dictionary<string, IPrototype<T>>) that maps string keys to pre-configured prototype instances. The client asks the registry for a clone by name: registry.Create("goblin"). The registry finds the matching prototype, calls DeepClone(), and returns the independent copy.

The big advantage: the client is completely decoupled from concrete types. It never writes new Goblin() or new Orc(). It only knows about the registry and the prototype interface. New types can be added to the registry at runtime (loaded from config files, database, or plugins) without touching any client code. In a game server, you could add a new "dragon" enemy type by simply loading a new config entry — zero code changes, zero recompilation.
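A minimal registry sketch under those rules — GameEntity and PrototypeRegistry are illustrative names; note the prototypes dictionary is private, so only clones ever escape:

```csharp
public interface IPrototype<T> { T DeepClone(); }

public class GameEntity : IPrototype<GameEntity>
{
    public string Name { get; set; } = "";
    public int Health { get; set; }
    public GameEntity DeepClone() => new() { Name = Name, Health = Health };
}

public class PrototypeRegistry
{
    private readonly Dictionary<string, GameEntity> _prototypes = new();

    public void Register(string key, GameEntity prototype) => _prototypes[key] = prototype;

    // Only clones escape — the template itself is never handed out.
    public GameEntity Create(string key) =>
        _prototypes.TryGetValue(key, out var proto)
            ? proto.DeepClone()
            : throw new KeyNotFoundException($"No prototype for '{key}'.");
}
```

The client writes `registry.Create("goblin")` and never touches `new GameEntity()` or the template.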

Great Answer Bonus "Mention that the registry should never expose the prototype directly — only expose Clone(). Otherwise callers can accidentally mutate the template."
Think First What does the 'with' expression actually generate under the hood?

C# records are the language saying: "Prototype is so useful, we'll bake it into the syntax." When you declare a record, the compiler secretly generates: (1) a protected copy constructor that does a memberwise copy, and (2) a hidden method called <Clone>$ that calls this copy constructor. You never see these — they're auto-generated.

The with expression is the user-friendly way to clone with modifications: var clone = original with { Name = "New" };. Under the hood, this compiles to roughly: "clone the original, then set Name on the clone." It's Prototype in one line of code, with zero boilerplate.

The caveat: records do shallow cloning. This is safe when all properties are immutable (other records, strings, value types) because immutable things can't be changed — sharing is harmless. But if a record has a mutable List<string> property, the with expression shares that list. The same shallow-copy trap as MemberwiseClone(), just with prettier syntax.
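The shared-list caveat in a few lines, with a hypothetical Enemy record:

```csharp
public record Enemy(string Name)
{
    public List<string> Abilities { get; init; } = new();
}
```

```csharp
var goblin = new Enemy("Goblin") { Abilities = { "slash" } };
var elite  = goblin with { Name = "Elite Goblin" };   // copies the LIST REFERENCE
elite.Abilities.Add("frenzy");                        // goblin.Abilities now has "frenzy" too!
```

Swapping `List<string>` for `ImmutableList<string>` (or another immutable collection) removes the trap, because sharing an immutable list is harmless.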

Great Answer Bonus "Distinguish record classes (reference types, shallow clone via with) from record structs (value types — assignment copies the whole struct automatically, but any reference-type fields inside it are still shared, so the copy is not truly deep)."

Medium

Think First What are the trade-offs of each serialization format for cloning?

Serialize the object to a byte stream or string, then deserialize it back into a new instance. This creates a completely independent deep copy because deserialization constructs entirely new objects.

Three common approaches in modern .NET:

  • System.Text.Json — Built-in, handles circular refs with ReferenceHandler.Preserve. Medium speed. No attributes required on simple types.
  • MessagePack — Binary format, typically 3-10x faster than JSON. Requires [MessagePackObject] attributes. Best for hot paths.
  • protobuf-net — Google's Protocol Buffers for .NET. Very fast, compact. Requires [ProtoContract] attributes.

The trade-off: serialization-based cloning is 100-1000x slower than MemberwiseClone and requires all properties to be serializable. But it handles arbitrary object graphs with zero manual code.
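A sketch of the System.Text.Json variant as an extension method (the JsonDeepClone name is illustrative); ReferenceHandler.Preserve adds $id/$ref markers so circular graphs survive the round trip:

```csharp
using System.Text.Json;
using System.Text.Json.Serialization;

public static class JsonCloneExtensions
{
    private static readonly JsonSerializerOptions Options = new()
    {
        ReferenceHandler = ReferenceHandler.Preserve
    };

    // Serialize to a string, deserialize back: every object in the graph is new.
    public static T JsonDeepClone<T>(this T source) where T : class =>
        JsonSerializer.Deserialize<T>(JsonSerializer.Serialize(source, Options), Options)!;
}
```

Remember the earlier pitfall: this only clones what the serializer sees, so [JsonIgnore] properties and private fields are silently dropped.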

Great Answer Bonus "Warn about [JsonIgnore] and private fields — serialization cloning only copies what the serializer sees. Properties marked with ignore attributes are silently dropped."
Think First What does Factory Method require that Prototype doesn't?

Choose Prototype when: (1) object construction is expensive and you want to avoid repeating it, (2) the number of product variants is large or dynamic (loading types at runtime), (3) you want to avoid a parallel hierarchy of factory classes — one for each product type.

Choose Factory Method when: (1) each variant has different construction logic (not just different data), (2) the creation process itself varies across types, (3) you want compile-time type safety via the class hierarchy.

In practice, Prototype avoids the "class explosion" problem — instead of 20 factory subclasses for 20 product types, you have 20 prototype instances in a registry.

Great Answer Bonus "Mention that Prototype and Factory Method can be combined — a factory method that clones a prototype and customizes it, getting the decoupling benefits of both."
Think First What data structure breaks infinite recursion during graph traversal?

Use a Dictionary<object, object> (with ReferenceEqualityComparer.InstanceA comparer that uses Object.ReferenceEquals for equality (is this the exact same memory address?) instead of the potentially-overridden Equals method. Critical for visited dictionaries because you need to detect if you've seen this EXACT OBJECT before, not if you've seen an "equal" object. Available since .NET 5.) as a "visited" map. Before cloning any object, check if it's already in the map — if yes, return the existing clone. If no, create the clone, add it to the map BEFORE recursing into children, then clone the children.

The critical detail: register the clone in the visited map BEFORE populating its fields. This ensures that when a child references back to the parent, it finds the already-created (but not yet fully populated) clone — breaking the cycle.

Alternatively, use serializers that handle cycles natively: System.Text.Json with ReferenceHandler.Preserve adds $id/$ref markers to track references.
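One way to sketch the visited-map approach for the Employee graph used in the earlier circular-reference test (property names assumed):

```csharp
public class Employee
{
    public string Name { get; set; } = "";
    public Employee? Manager { get; set; }
    public List<Employee> DirectReports { get; } = new();

    public Employee DeepClone() =>
        DeepClone(new Dictionary<Employee, Employee>(ReferenceEqualityComparer.Instance));

    private Employee DeepClone(Dictionary<Employee, Employee> visited)
    {
        // Already cloned this exact object? Return its clone — breaks the cycle.
        if (visited.TryGetValue(this, out var existing)) return existing;

        var clone = new Employee { Name = Name };
        visited[this] = clone;                 // register BEFORE recursing into children
        clone.Manager = Manager?.DeepClone(visited);
        foreach (var report in DirectReports)
            clone.DirectReports.Add(report.DeepClone(visited));
        return clone;
    }
}
```

Because `IEqualityComparer<T>` is contravariant, `ReferenceEqualityComparer.Instance` (which implements `IEqualityComparer<object?>`) slots straight into the `Dictionary<Employee, Employee>` constructor on .NET 5+.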

Great Answer Bonus "Explain why you must use ReferenceEqualityComparer (compares by reference identity) instead of the default EqualityComparer (which may use overridden Equals and cause false matches)."
Think First What makes a database entity unique from the database's perspective?

These fields must be reset or regenerated in the clone:

  • Primary Key (Id) — Set to 0/default so the database generates a new one
  • CreatedAt / UpdatedAt timestamps — Should reflect clone creation time
  • Audit fields (CreatedBy, ModifiedBy) — Should reflect the current user
  • Navigation property IDs — Child entity IDs must also be reset (0) so EF Core treats them as new
  • Concurrency tokensA mechanism for optimistic concurrency control. EF Core uses a RowVersion column (SQL Server) or xmin (PostgreSQL) to detect conflicts. When saving, EF checks if the token matches the database. If someone else modified the row, the tokens won't match and EF throws DbUpdateConcurrencyException. A clone must have a fresh token to avoid false conflicts. (RowVersion) — Must be reset to avoid stale-data conflicts
  • Unique constraints (slug, email, SKU) — Append "(Copy)" or generate new values
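Put together, a clone-for-insert helper might look like this — BlogPost, Comment, and the field set are hypothetical; adjust to your model:

```csharp
public class Comment { public int Id; public string Text = ""; }

public class BlogPost
{
    public int Id;
    public string Title = "";
    public string Body = "";
    public DateTime CreatedAt;
    public string CreatedBy = "";
    public byte[]? RowVersion;
    public List<Comment> Comments { get; } = new();
}

public static class EntityCloner
{
    public static BlogPost CloneForInsert(BlogPost source, string currentUser)
    {
        var clone = new BlogPost
        {
            Id = 0,                           // let the database generate a new key
            Title = source.Title + " (Copy)", // avoid unique-constraint collisions
            Body = source.Body,
            CreatedAt = DateTime.UtcNow,      // clone creation time, not the original's
            CreatedBy = currentUser,
            RowVersion = null,                // fresh concurrency token
        };
        foreach (var comment in source.Comments)
            clone.Comments.Add(new Comment { Id = 0, Text = comment.Text }); // child keys reset too
        return clone;
    }
}
```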
Great Answer Bonus "Mention EF Core's AsNoTracking() for loading the source entity detached from the change tracker — otherwise EF thinks you're modifying the original."
Think First If an object can't be mutated, do you still need to clone it?

Immutability is Prototype's biggest competitor. If an object is truly immutable (all fields readonly, no mutable collections), you don't need to clone it — multiple consumers can safely share the same instance. Changing a property means creating a new instance with the changed value (which is what records with with do).

The relationship: Prototype solves the problem of "I need a similar but modified version of this object." Immutability solves the same problem differently — instead of "clone and mutate," you do "create new with changes." Records with with are the bridge: they use Prototype's mechanism (clone via copy constructor) to support immutability's semantics (new instance for every change).

Great Answer Bonus "Argue that immutable types with 'with' expressions have made the classic mutable Prototype pattern less necessary in modern C#. The pattern is still relevant for mutable objects with expensive construction, but the immutability-first approach is preferred."
Think First What concurrent operations need to be safe: reads, writes, or both?

Three approaches depending on access patterns:

  • ConcurrentDictionary — Thread-safe reads and writes. Best when prototypes are added/updated at runtime. The TryGetValue + DeepClone() pattern is naturally safe because each clone is an independent object.
  • ImmutableDictionary — Replace the entire dictionary atomically on update. Best when updates are rare and reads dominate (hot path spawning, cold path configuration).
  • Read-only dictionary at startup — Load all prototypes during initialization, store in a IReadOnlyDictionary. No concurrency concerns because it never changes. Best for static registries (game enemy types, document templates).

Critical: the DeepClone() method itself must be thread-safe — it should only READ from the prototype, never mutate it. If your clone method has any temporary state modifications, protect them with a lock or make the method pure.
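The ConcurrentDictionary variant is a few lines — a sketch with an illustrative generic registry:

```csharp
using System.Collections.Concurrent;

public interface IPrototype<T> { T DeepClone(); }

// Reads are lock-free; every Create returns an independent clone,
// so concurrent callers never share mutable state.
public class ThreadSafeRegistry<T> where T : class, IPrototype<T>
{
    private readonly ConcurrentDictionary<string, T> _prototypes = new();

    public void Register(string key, T prototype) => _prototypes[key] = prototype;

    public T? Create(string key) =>
        _prototypes.TryGetValue(key, out var proto) ? proto.DeepClone() : null;
}
```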

Great Answer Bonus "Note that ConcurrentDictionary's TryGetValue is lock-free for reads — it uses a striped locking strategy that only locks on writes. So high-read, low-write registries get excellent throughput."
Think First Both involve "saving state." What's the difference in intent?

Prototype's intent: Create a new object by copying an existing one. The clone is used going forward — it becomes its own entity with its own lifecycle.

Memento's intent: Capture an object's internal state so it can be restored later. The memento is a snapshot saved for undo/rollback — it's not a new active entity.

They often use the same mechanism (cloning/serialization), but the purpose is different. Prototype says "I need a new object like this one." Memento says "I need to remember what this object looked like at this point in time." In practice, they complement each other: use Prototype to create the snapshot (clone), use Memento to store and manage it.

Great Answer Bonus "Give a concrete example: a document editor uses Prototype to deep-clone the document state before each edit (creating the snapshot), and Memento to manage the undo stack (storing and restoring those snapshots)."
Think First Manual "mutate and assert" tests only catch bugs you wrote tests for. How do you guard against properties you don't know about yet?

Write a reflection-based guard test that automatically discovers all reference-type properties and verifies they were deep-copied. This catches forgotten properties the moment they're added — no manual test updates needed:

  • Use reflection to get all public properties of the cloneable class
  • Filter to reference types (excluding string, which is immutable)
  • Clone the object, then call Assert.NotSame() on each reference-type property of the original and the clone
  • If a developer adds a new List<T> property but forgets to deep-copy it, this test fails immediately because NotSame detects the shared reference

The key insight: this is a structural guard, not a behavioral test. It doesn't test what the clone does — it tests that the clone's shape is correct. Pair it with a count assertion (Assert.Equal(expectedCount, referenceProps.Length)) so the test also fails if someone adds a property but marks it to be skipped.
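A sketch of such a guard test with xunit, using a hypothetical Product class so the block is self-contained:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;
using Xunit;

// Minimal clonable class for the demonstration (hypothetical).
public class Product
{
    public string Name { get; set; } = "";
    public List<string> Tags { get; set; } = new();
    public Dictionary<string, string> Attributes { get; set; } = new();

    public Product DeepClone() => new()
    {
        Name = Name,
        Tags = new List<string>(Tags),
        Attributes = new Dictionary<string, string>(Attributes)
    };
}

public class DeepCloneGuardTests
{
    [Fact]
    public void Clone_DeepCopies_AllReferenceTypeProperties()
    {
        var original = new Product { Tags = { "a" }, Attributes = { ["k"] = "v" } };
        var clone = original.DeepClone();

        // Discover every reference-type property automatically.
        var referenceProps = typeof(Product)
            .GetProperties(BindingFlags.Public | BindingFlags.Instance)
            .Where(p => !p.PropertyType.IsValueType && p.PropertyType != typeof(string))
            .ToArray();

        foreach (var prop in referenceProps)
            Assert.NotSame(prop.GetValue(original), prop.GetValue(clone));

        // Fails when a property is added but not accounted for here.
        Assert.Equal(2, referenceProps.Length);
    }
}
```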

Great Answer Bonus "Take it further: create a [DeepCloneIgnore] custom attribute for intentionally shared references (like a logger or config), and have the reflection test skip those. Now you have a self-documenting system where every shared reference is an explicit opt-in decision, not an accidental omission."
Think First Events are backed by delegates. What happens when MemberwiseClone copies a delegate field?

Events in C# are backed by delegate fields (reference types). MemberwiseClone() copies the delegate reference — meaning the clone's event points to the same subscriber list as the original.

The result: raising an event on the clone fires handlers that were subscribed to the original. In a UI application, changing a property on a cloned ViewModel updates the wrong UI panel. The fix: never copy event fields during cloning. Create a new instance and let the new context subscribe fresh handlers.
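A sketch of the fix, with a hypothetical ViewModel whose clone deliberately leaves the event unsubscribed:

```csharp
using System;

public class ViewModel
{
    public string Title { get; set; } = "";

    // Backed by a compiler-generated delegate field; a memberwise copy
    // would share this subscriber list between original and clone.
    public event EventHandler? TitleChanged;

    public void Rename(string title)
    {
        Title = title;
        TitleChanged?.Invoke(this, EventArgs.Empty);
    }

    public ViewModel DeepClone() => new()
    {
        Title = Title
        // TitleChanged deliberately NOT copied: the clone starts with an
        // empty subscriber list and its new owner subscribes fresh handlers.
    };
}
```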

Great Answer Bonus "Note that delegates in .NET are immutable — subscribing creates a new multicast delegate. So after cloning, if the original gets a new subscriber, the clone's delegate is unchanged. But the EXISTING subscribers are still shared, which is the real bug."
Think First When does a clone actually NEED its own copy of the data?

Copy-on-write (CoW) delays the deep copy until a mutation actually occurs. The clone initially shares data with the original (cheap), and only creates a private copy of a field when that field is about to be modified.

.NET's ImmutableList<T> and ImmutableDictionary<K,V> use a variant of this via structural sharingInstead of copying the entire data structure, new versions share unchanged internal tree nodes with the previous version. Adding one item to a 1M-element ImmutableList creates ~20 new nodes (O(log n)) while sharing 999,980 existing nodes. This gives you the semantics of immutability with the memory efficiency of mutation.. For custom objects, you can wrap mutable fields in a Lazy<T> or use a flag to track whether the field has been "detached" from the original.

This is most useful when clones are read-heavy — many clones are created but few are actually modified. The unmodified clones share data with the original, saving memory and allocation time.
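A minimal, single-threaded sketch of the detach-flag approach (CowEntity is an illustrative name; a production version would need thread-safety around the detach):

```csharp
using System.Collections.Generic;

public sealed class CowEntity
{
    private List<string> _tags;
    private bool _ownsTags;   // false while the list is shared with another instance

    public CowEntity(List<string> tags) => (_tags, _ownsTags) = (tags, true);

    private CowEntity(List<string> shared, bool owns) => (_tags, _ownsTags) = (shared, owns);

    // Cheap clone: share the list; BOTH sides must now copy before writing.
    public CowEntity Clone()
    {
        _ownsTags = false;
        return new CowEntity(_tags, owns: false);
    }

    public IReadOnlyList<string> Tags => _tags;

    public void AddTag(string tag)
    {
        if (!_ownsTags)                       // first write detaches
        {
            _tags = new List<string>(_tags);  // private copy, paid only now
            _ownsTags = true;
        }
        _tags.Add(tag);
    }
}
```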

Great Answer Bonus "Mention that Git uses copy-on-write internally — branches are cheap because they share all unchanged files with the parent. Only modified files get new copies. Same principle applied to software objects."
Think First What does a source generator do differently from reflection or serialization?

Pros:

  • Zero reflection at runtime — all mapping code is generated at compile time
  • AOT-compatible (works with Native AOT / trimming)
  • Type-safe — compile errors if you add a property without updating the mapper
  • Fast — comparable to hand-written copy code
  • Can selectively ignore fields (e.g., [MapperIgnoreTarget(nameof(Id))])

Cons:

  • Requires a third-party package (Mapperly, the .NET equivalent of Java's MapStruct)
  • Shallow by default — you need to configure nested type mappings
  • Generated code can be hard to debug
  • Doesn't handle circular references automatically
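A hedged sketch of what this looks like with Mapperly (assumes the Riok.Mapperly NuGet package; Product is a hypothetical type). Note UseDeepCloning, which addresses the shallow-by-default behavior listed above:

```csharp
using System;
using System.Collections.Generic;
using Riok.Mapperly.Abstractions;

// Hypothetical clonable type.
public class Product
{
    public Guid Id { get; set; }
    public string Name { get; set; } = "";
    public List<string> Tags { get; set; } = new();
}

// UseDeepCloning makes the generator copy collections and nested
// objects instead of assigning references (shallow is the default).
[Mapper(UseDeepCloning = true)]
public partial class ProductCloner
{
    [MapperIgnoreTarget(nameof(Product.Id))]  // clone starts with a default Id
    public partial Product Clone(Product source);
}
```

Because the mapping body is generated at compile time, adding a property to Product updates the clone automatically, and the generated code is plain C# you can step through.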
Great Answer Bonus "Compare with AutoMapper (runtime reflection, slower, more flexible) vs Mapperly (compile-time source gen, faster, less flexible). For cloning specifically, Mapperly is the better choice."
Think First Why can't you just call Clone() on a tracked EF entity?

EF Core's change tracker complicates cloning. A tracked entity is "attached" to a DbContext. If you clone it and call Add(), EF might see duplicate key conflicts or try to update the original instead of inserting a new row.

The correct approach:

  • Load the source entity with AsNoTracking() — this gives you a detached entity
  • Include all navigation properties you want to clone (eager loading)
  • Reset the primary key (Id = 0) on the root AND all child entities
  • Reset audit fields (CreatedAt, RowVersion)
  • Call db.Set&lt;TEntity&gt;().Add(clone) (or the corresponding DbSet property) — EF treats it as a brand-new entity
  • Call SaveChangesAsync() — the database generates new IDs
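The steps above can be sketched as follows, using hypothetical Order/OrderLine entities and an AppDbContext (the names are illustrative assumptions, not from a real schema):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public static class OrderCloner
{
    public static async Task<int> CloneOrderAsync(AppDbContext db, int sourceId)
    {
        // 1–2. Detached load, eagerly including the children to duplicate.
        var clone = await db.Orders
            .AsNoTracking()
            .Include(o => o.Lines)
            .SingleAsync(o => o.Id == sourceId);

        // 3–4. Reset keys and audit fields on the root AND all children
        // so EF treats everything as new.
        clone.Id = 0;
        clone.CreatedAt = DateTime.UtcNow;
        foreach (var line in clone.Lines)
        {
            line.Id = 0;
            line.OrderId = 0;
        }

        // 5–6. Add + save: the database generates fresh identities.
        db.Orders.Add(clone);
        await db.SaveChangesAsync();
        return clone.Id;
    }
}
```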
Great Answer Bonus "Warn about owned types and value objects — EF Core has special tracking for these. Cloning an entity with owned types requires extra care to ensure the owned objects are also treated as new."

Hard

Think First How is this different from an object pool? When would pre-cloning be better than on-demand cloning?

A prototype pool pre-clones N instances and serves them from a buffer. When the buffer is empty, it refills by cloning more from the prototype. This amortizes clone cost across requests.

PrototypePool.cs
public sealed class PrototypePool<T> where T : class, IPrototype<T>
{
    private readonly T _prototype;
    private readonly Channel<T> _pool;
    private readonly int _refillSize;

    public PrototypePool(T prototype, int capacity = 100, int refillSize = 25)
    {
        _prototype = prototype;
        _refillSize = refillSize;
        _pool = Channel.CreateBounded<T>(capacity);
        Refill(capacity); // pre-clone initial batch
    }

    public async ValueTask<T> RentAsync(CancellationToken ct = default)
    {
        if (_pool.Reader.TryRead(out var item))
            return item;

        // Pool empty — refill in background, return one clone immediately
        _ = Task.Run(() => Refill(_refillSize), ct);
        return _prototype.DeepClone();
    }

    private void Refill(int count)
    {
        for (int i = 0; i < count; i++)
        {
            if (!_pool.Writer.TryWrite(_prototype.DeepClone()))
                break; // pool is full
        }
    }
}

Key: Channel<T> is thread-safe and designed for producer/consumer handoff. The pool is bounded to prevent memory blowup. Background refill keeps the pool warm without blocking callers.

Great Answer Bonus "Contrast with ObjectPool<T> from Microsoft.Extensions.ObjectPool — that pattern reuses and RETURNS objects (rent/return cycle). A prototype pool creates NEW objects from a template (rent only, no return). Different lifecycles, different use cases."
Think First How do you update a prototype template without breaking clones that were already created from the old version?

Use an immutable snapshot approach: each registry update creates a new version (immutable dictionary) rather than mutating the existing one. Previous versions are retained for rollback.

VersionedRegistry.cs
public sealed class VersionedPrototypeRegistry<T> where T : class, IPrototype<T>
{
    private readonly Stack<ImmutableDictionary<string, T>> _versions = new();
    private volatile ImmutableDictionary<string, T> _current;
    private readonly object _lock = new(); // Lock required: Stack<T> is not thread-safe

    public VersionedPrototypeRegistry(IDictionary<string, T> initial)
    {
        _current = initial.ToImmutableDictionary();
        _versions.Push(_current);
    }

    public T Create(string key) =>
        _current.TryGetValue(key, out var proto)
            ? proto.DeepClone()
            : throw new KeyNotFoundException(key);

    public void Update(string key, T newPrototype)
    {
        lock (_lock)
        {
            var next = _current.SetItem(key, newPrototype);
            _versions.Push(next);
            _current = next;
        }
    }

    public void Rollback()
    {
        lock (_lock)
        {
            if (_versions.Count <= 1) throw new InvalidOperationException("No version to rollback to");
            _versions.Pop();
            _current = _versions.Peek();
        }
    }
}
Great Answer Bonus "Note that ImmutableDictionary uses structural sharing — adding one key doesn't copy the entire dictionary. Only O(log n) nodes change, making version snapshots memory-efficient."
Think First What do you do when an object holds an ILogger, HttpClient, or DbContext — types you can't (and shouldn't) clone?

Separate clonable data from non-clonable services. The clone operation should only copy data fields; service dependencies should be re-injected or shared.

  • Data fields (Name, Stats, Config) — deep clone these
  • Immutable services (ILogger, IConfiguration) — share the same reference (safe because they're thread-safe and stateless)
  • Stateful resources (Stream, DbContext) — do NOT clone. Either pass via method injection or resolve from DI in the clone's consumer.

Pattern: extract data into a separate EntityData class that implements IPrototype<EntityData>. The service wrapper holds a reference to data + services. Clone only the data, inject the services.
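A sketch of that separation, assuming Microsoft.Extensions.Logging's ILogger as the shared stateless service (EntityData/Entity are illustrative names):

```csharp
using System.Collections.Generic;
using Microsoft.Extensions.Logging;

// Pure data: everything here is safe and cheap to deep-clone.
public sealed class EntityData
{
    public string Name { get; set; } = "";
    public Dictionary<string, int> Stats { get; set; } = new();

    public EntityData DeepClone() => new()
    {
        Name = Name,
        Stats = new Dictionary<string, int>(Stats)
    };
}

// Wrapper: holds data + services. Cloning copies the data and
// SHARES the stateless service reference.
public sealed class Entity
{
    private readonly ILogger _logger;
    public EntityData Data { get; }

    public Entity(EntityData data, ILogger logger) => (Data, _logger) = (data, logger);

    public Entity Clone() => new(Data.DeepClone(), _logger);
}
```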

Great Answer Bonus "This is why the 'pure data' approach (records, DTOs) works so well with Prototype — no service dependencies to worry about. The pattern works best when your clonable objects are pure data."
Think First How would you register a prototype in IServiceCollection so each resolution returns a clone?

Register the prototype as Singleton (the template), then register a Transient factory that clones it on each resolution:

DiPrototype.cs
// Register the prototype template as Singleton
builder.Services.AddSingleton(new ReportTemplate
{
    Sections = { "Header", "Compliance", "Footer" },
    Compliance = new ComplianceData { GdprChecked = true }
});

// Register a Transient factory that clones the template per request
builder.Services.AddTransient<Report>(sp =>
{
    var template = sp.GetRequiredService<ReportTemplate>();
    return template.DeepClone();  // fresh clone per resolution
});

// Each controller gets its own independent Report instance
public class ReportController(Report report) { ... }

With .NET 8 Keyed ServicesA .NET 8 DI feature that lets you register multiple implementations of the same service type, distinguished by a key. builder.Services.AddKeyedSingleton<ICache>("redis", new RedisCache()); Resolve with [FromKeyedServices("redis")] ICache cache or sp.GetRequiredKeyedService<ICache>("redis"). Perfect for prototype registries in DI., you can register multiple named prototypes:

KeyedPrototype.cs
builder.Services.AddKeyedSingleton("engineering", new ReportTemplate { Department = "Engineering" });
builder.Services.AddKeyedSingleton("marketing", new ReportTemplate { Department = "Marketing" });

builder.Services.AddTransient<Report>(sp =>
{
    var dept = sp.GetRequiredService<ICurrentUser>().Department;
    var template = sp.GetRequiredKeyedService<ReportTemplate>(dept);
    return template.DeepClone();
});
Great Answer Bonus "Warn about lifetime mismatch: if the prototype is Singleton but the clone holds Scoped services, you have a captive dependency. Keep prototypes as pure data — no injected services."
Think First How do expression trees achieve near-manual performance with the convenience of automatic cloning?

Expression-tree cloning (used by libraries like DeepCloner) analyzes a type's fields via reflection once, generates an expression tree that performs field-by-field copying, compiles it into a delegate, and caches the delegate. Subsequent clones call the compiled delegate directly — near-native speed.
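A minimal illustration of the technique — reflect once, compile once, then every clone is a plain delegate call. This version is SHALLOW and properties-only; real libraries like DeepCloner recurse into fields and handle cycles:

```csharp
using System;
using System.Linq;
using System.Linq.Expressions;

public static class CompiledCloner<T> where T : new()
{
    // Built once per T, then cached for the lifetime of the process.
    private static readonly Func<T, T> _clone = Build();

    private static Func<T, T> Build()
    {
        var source = Expression.Parameter(typeof(T), "source");
        var bindings = typeof(T)
            .GetProperties()
            .Where(p => p.CanRead && p.CanWrite)
            .Select(p => Expression.Bind(p, Expression.Property(source, p)));

        // Equivalent to: source => new T { Prop1 = source.Prop1, ... }
        var body = Expression.MemberInit(Expression.New(typeof(T)), bindings);
        return Expression.Lambda<Func<T, T>>(body, source).Compile();
    }

    public static T Clone(T instance) => _clone(instance);
}
```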

Approach | Speed | Effort | Handles Cycles | AOT Compatible
Manual deep clone | Fastest | High (write per class) | Manual (visited dict) | Yes
Expression tree | Near-manual | Zero (automatic) | Yes (library handles) | No (Reflection.Emit)
Serialization (JSON) | Slow (100x) | Zero | With ReferenceHandler | With source gen
Source generator (Mapperly) | Near-manual | Low (partial class) | No | Yes
Great Answer Bonus "Note that expression-tree cloning is incompatible with Native AOT (which doesn't support runtime code generation). If you're targeting AOT, use Mapperly source generators or manual cloning."
Think First If Prototype clones everything, and structural sharing shares most things, which is more memory-efficient?

Structural sharing is an optimization of the Prototype concept. Instead of copying the entire data structure (O(n) memory), a new "version" shares unchanged nodes with the previous version and only allocates new nodes for changed elements (O(log n) memory for balanced trees).

ImmutableList<T>.Add(item) returns a new list that shares most of its internal tree nodes with the original — only the path from the new item to the root is newly allocated. For a list of 1,000,000 items, adding one item creates ~20 new tree nodes (log2(1M) ≈ 20), not 1,000,001 copies.

This is Prototype's "spirit" (create a new version from an existing one) with a dramatically better implementation for large collections. The trade-off: access patterns change (O(log n) random access instead of O(1) for arrays), and the implementation complexity is hidden inside the immutable collection library.
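A small demonstration of the cost difference:

```csharp
using System;
using System.Collections.Immutable;
using System.Linq;

var original = ImmutableList.CreateRange(Enumerable.Range(0, 1_000_000));

// A "clone with one change": allocates ~log2(n) tree nodes, shares the rest.
var modified = original.Add(-1);

Console.WriteLine(original.Count);  // 1000000 — the original is untouched
Console.WriteLine(modified.Count);  // 1000001
```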

Great Answer Bonus "Connect to persistent data structures from functional programming (Clojure, Haskell). C#'s System.Collections.Immutable brings these concepts to .NET."
Think First If you don't know the concrete types at compile time, how do you create instances?

This is one of Prototype's strongest use cases. The plugin system loads assemblies at runtime, discovers types implementing IPrototype<T>, creates one instance of each (the prototype), and registers it in the registry. Users create new instances by cloning.

PluginPrototypes.cs
public interface IPlugin : IPrototype<IPlugin>
{
    string Name { get; }
    void Execute();
}

public class PluginRegistry
{
    private readonly Dictionary<string, IPlugin> _prototypes = new();

    public void LoadPlugins(string pluginDir)
    {
        foreach (var dll in Directory.GetFiles(pluginDir, "*.dll"))
        {
            var assembly = Assembly.LoadFrom(dll);
            var pluginTypes = assembly.GetTypes()
                .Where(t => typeof(IPlugin).IsAssignableFrom(t) && !t.IsAbstract);

            foreach (var type in pluginTypes)
            {
                var prototype = (IPlugin)Activator.CreateInstance(type)!;
                _prototypes[prototype.Name] = prototype;
            }
        }
    }

    public IPlugin Create(string name) => _prototypes[name].DeepClone();
}
Great Answer Bonus "Explain why Prototype is better than Factory Method here: Factory Method would require a factory per plugin type (impossible if types are unknown at compile time). Prototype only requires the common IPrototype interface — any type that implements it can be cloned."
Think First When processing a command, why might you need a copy of the current aggregate state?

In Event Sourcing, aggregates are rebuilt by replaying events. This replay is expensive for aggregates with many events. Prototype appears in two places:

  • Snapshotting: Periodically deep-clone the aggregate state and persist the snapshot. Future rebuilds start from the snapshot instead of replaying all events from the beginning.
  • Command validation: Before applying a command, clone the current state, apply the command to the clone, validate the result, and only then apply to the real aggregate. If validation fails, discard the clone — the real state is untouched.

This "speculative execution" pattern uses Prototype as an isolation mechanism: test mutations on a copy, commit only if valid.
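A sketch of speculative execution (all names — WithdrawCommand, Account, TryHandle — are hypothetical):

```csharp
using System;

public record WithdrawCommand(decimal Amount);

public class Account
{
    public decimal Balance { get; set; }
    public Account DeepClone() => new() { Balance = Balance };
    public void Apply(WithdrawCommand cmd) => Balance -= cmd.Amount;
}

public static class CommandHandler
{
    public static bool TryHandle(Account aggregate, WithdrawCommand cmd)
    {
        var trial = aggregate.DeepClone();  // isolated copy
        trial.Apply(cmd);                   // mutate the copy only

        if (trial.Balance < 0)
            return false;                   // invalid — real state untouched

        aggregate.Apply(cmd);               // validated — commit for real
        return true;
    }
}
```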

Great Answer Bonus "Connect to the Memento pattern: the snapshot IS a memento, and Prototype is the mechanism that creates it. The event store is the command log, and the snapshot is the optimization layer."
Think First If Circle extends Shape, and Shape.Clone() uses MemberwiseClone, does it clone Circle's fields?

MemberwiseClone() on a base class DOES copy subclass fields — it copies the entire object regardless of which class declares the method. So Shape.MemberwiseClone() called on a Circle produces a Circle with all fields copied.

The problem arises with manual deep clone: if Shape.DeepClone() creates new Shape { ... }, it slices the Circle fields. The fix: make DeepClone() virtual, override in each subclass, and use the covariant return typeA C# 9 feature that allows an overriding method to return a more derived type than the base method declares. Shape.DeepClone() returns Shape, but Circle.DeepClone() can return Circle. The compiler allows this because Circle IS a Shape — the override is type-safe and avoids casting. feature (C# 9+):

InheritanceClone.cs
public abstract class Shape
{
    public double X { get; set; }
    public double Y { get; set; }
    public abstract Shape DeepClone();  // each subclass overrides
}

public class Circle : Shape
{
    public double Radius { get; set; }
    public override Circle DeepClone() => new()  // covariant return (C# 9+)
        { X = X, Y = Y, Radius = Radius };
}

public class Rectangle : Shape
{
    public double Width { get; set; }
    public double Height { get; set; }
    public override Rectangle DeepClone() => new()
        { X = X, Y = Y, Width = Width, Height = Height };
}
Great Answer Bonus "Mention that C# 9's covariant return types are perfect for Prototype + inheritance: Circle.DeepClone() returns Circle, not Shape, while still satisfying the abstract Shape.DeepClone() contract."
Think First How would you store the canvas state at each step without consuming excessive memory?

Before each mutation, deep-clone the canvas state and push it onto an undo stack. Undo = pop the stack and replace current state. Redo = push current onto a redo stack before restoring.

UndoSystem.cs
public class Canvas : IPrototype<Canvas>
{
    public List<Shape> Shapes { get; set; } = new();

    public Canvas DeepClone() => new()
    {
        Shapes = Shapes.Select(s => s.DeepClone()).ToList()
    };
}

public class UndoManager
{
    private readonly Stack<Canvas> _undoStack = new();
    private readonly Stack<Canvas> _redoStack = new();
    private Canvas _current;

    public UndoManager(Canvas initial) => _current = initial;

    public void Execute(Action<Canvas> mutation)
    {
        _undoStack.Push(_current.DeepClone());  // snapshot BEFORE mutation
        _redoStack.Clear();                      // new action clears redo
        mutation(_current);
    }

    public void Undo()
    {
        if (_undoStack.Count == 0) return;
        _redoStack.Push(_current.DeepClone());
        _current = _undoStack.Pop();
    }

    public void Redo()
    {
        if (_redoStack.Count == 0) return;
        _undoStack.Push(_current.DeepClone());
        _current = _redoStack.Pop();
    }
}
Great Answer Bonus "Discuss the memory optimization: limit undo stack depth (e.g., last 50 actions), or use incremental diffs (Command pattern) instead of full snapshots for small changes, reserving Prototype-based snapshots for expensive periodic checkpoints."
Think First Which cloning approaches use reflection, and what happens to reflection-based code under AOT?

Native AOTAhead-of-Time compilation that produces a self-contained native executable with no runtime JIT compilation needed. Available since .NET 7 (preview) and .NET 8 (stable). The executable starts instantly and has a smaller memory footprint, but cannot use runtime code generation (Reflection.Emit, dynamic assemblies, etc.). eliminates the JIT compiler — all code must be known at compile time. This affects Prototype in several ways:

  • Manual deep clone — fully AOT-compatible. No reflection, no runtime code generation.
  • MemberwiseClone() — AOT-compatible (CLR intrinsic, not reflection).
  • Records with with — AOT-compatible (compiler-generated code).
  • Expression-tree cloning (DeepCloner) — NOT AOT-compatible (uses Reflection.Emit).
  • BinaryFormatter — Removed entirely in .NET 9.
  • System.Text.Json — AOT-compatible WITH source generators ([JsonSerializable]).
  • Mapperly — Fully AOT-compatible (source-generated at compile time).

Trimming also removes unused types. If a prototype registry dynamically loads types, the trimmer might remove those types. Mark them with [DynamicallyAccessedMembers]A .NET trimming annotation that tells the linker "this Type reference needs specific members preserved." Without it, the trimmer may remove types it thinks are unused — but your plugin loader needs them at runtime. Apply to Type parameters in prototype registries and plugin systems. or use source generators to preserve them.

Great Answer Bonus "Recommend a decision matrix: use manual DeepClone for hot paths and AOT. Use Mapperly source generators for convenience + AOT. Use JSON with source gen for graphs with circular refs + AOT. Never use expression-tree or BinaryFormatter approaches in AOT scenarios."
Section 18

Practice Exercises

Exercise 1: Find the 3 Bugs in This Clone Easy

This Product class has a DeepClone() method with 3 bugs. Find and fix them all. (Hint: run the test mentally — what would fail?)

BuggyProductClone.cs
public class Product : IPrototype<Product>
{
    public Guid Id { get; set; } = Guid.NewGuid();
    public string Name { get; set; } = "";
    public decimal Price { get; set; }
    public List<string> Tags { get; set; } = new();
    public Dictionary<string, string> Attributes { get; set; } = new();

    public Product DeepClone() => new()
    {
        Id = this.Id,                  // Bug 1: ???
        Name = this.Name,
        Price = this.Price,
        Tags = this.Tags,             // Bug 2: ???
        Attributes = this.Attributes  // Bug 3: ???
    };
}
  • Bug 1: If two products share the same Id, what happens when you save them to a database?
  • Bug 2: Tags = this.Tags copies the reference, not the list. Add a tag to the clone — does the original's list change too?
  • Bug 3: Same problem as Bug 2 but with the dictionary. Both objects point to the same Dictionary instance.
FixedProductClone.cs
public class Product : IPrototype<Product>
{
    public Guid Id { get; set; } = Guid.NewGuid();
    public string Name { get; set; } = "";
    public decimal Price { get; set; }
    public List<string> Tags { get; set; } = new();
    public Dictionary<string, string> Attributes { get; set; } = new();

    public Product DeepClone() => new()
    {
        Id = Guid.NewGuid(),                                    // Fix 1: new identity
        Name = this.Name,
        Price = this.Price,
        Tags = new List<string>(this.Tags),                    // Fix 2: copy the list
        Attributes = new Dictionary<string, string>(this.Attributes) // Fix 3: copy the dict
    };
}

[Fact]
public void DeepClone_AllBugsFixed()
{
    var original = new Product
    {
        Name = "Widget", Price = 9.99m,
        Tags = { "sale", "new" },
        Attributes = { ["color"] = "blue" }
    };

    var clone = original.DeepClone();

    // Fix 1: Clone gets its own identity
    Assert.NotEqual(original.Id, clone.Id);

    // Fix 2: Collections are independent
    clone.Tags.Add("featured");
    Assert.Equal(2, original.Tags.Count);

    // Fix 3: Dictionary is independent
    clone.Attributes["color"] = "red";
    Assert.Equal("blue", original.Attributes["color"]);
}
Exercise 2: Deep Clone a Document with Nested Sections Medium

Create a Document with a List<Section> where each Section has a Heading, Content, and List<Comment>. Implement deep clone for the entire hierarchy. Test that modifying a comment in the clone doesn't affect the original.

  • Each level needs its own DeepClone: Comment.DeepClone(), Section.DeepClone(), Document.DeepClone()
  • Use LINQ Select to clone each element: Sections.Select(s => s.DeepClone()).ToList()
  • Test at all levels: modify a section heading, add a comment, change a comment's text
NestedClone.cs
public class Comment : IPrototype<Comment>
{
    public string Author { get; set; } = "";
    public string Text { get; set; } = "";
    public DateTime Posted { get; set; } = DateTime.UtcNow;

    public Comment DeepClone() => new()
        { Author = Author, Text = Text, Posted = Posted };
}

public class Section : IPrototype<Section>
{
    public string Heading { get; set; } = "";
    public string Content { get; set; } = "";
    public List<Comment> Comments { get; set; } = new();

    public Section DeepClone() => new()
    {
        Heading = Heading, Content = Content,
        Comments = Comments.Select(c => c.DeepClone()).ToList()
    };
}

public class Document : IPrototype<Document>
{
    public string Title { get; set; } = "";
    public List<Section> Sections { get; set; } = new();

    public Document DeepClone() => new()
    {
        Title = Title,
        Sections = Sections.Select(s => s.DeepClone()).ToList()
    };
}

[Fact]
public void DeepClone_NestedComments_AreIndependent()
{
    var doc = new Document
    {
        Title = "Spec",
        Sections =
        {
            new Section
            {
                Heading = "Intro",
                Comments = { new Comment { Author = "Alice", Text = "LGTM" } }
            }
        }
    };

    var clone = doc.DeepClone();
    clone.Sections[0].Comments[0].Text = "Needs revision";

    Assert.Equal("LGTM", doc.Sections[0].Comments[0].Text);
}
Exercise 3: Build a Prototype Registry for Game Enemies Medium

Create a PrototypeRegistry<T> that stores named prototypes and returns deep clones on demand. It should support adding new prototypes, removing prototypes, and listing available keys. Write tests for spawn correctness and prototype isolation.

  • Use Dictionary<string, T> internally where T : IPrototype<T>
  • The Create(key) method should clone then return — never expose the prototype directly
  • Consider: what should happen when the key doesn't exist? Throw? Return null?
GenericRegistry.cs
public class PrototypeRegistry<T> where T : class, IPrototype<T>
{
    private readonly Dictionary<string, T> _prototypes = new(StringComparer.OrdinalIgnoreCase);

    public void Register(string key, T prototype) => _prototypes[key] = prototype;
    public bool Remove(string key) => _prototypes.Remove(key);
    public IReadOnlyCollection<string> Keys => _prototypes.Keys;

    public T Create(string key)
    {
        if (!_prototypes.TryGetValue(key, out var proto))
            throw new KeyNotFoundException($"No prototype registered for '{key}'");
        return proto.DeepClone();
    }

    public T? TryCreate(string key) =>
        _prototypes.TryGetValue(key, out var proto) ? proto.DeepClone() : null;
}

[Fact]
public void Registry_SpawnedEntities_AreIndependent()
{
    var registry = new PrototypeRegistry<GameEntity>();
    registry.Register("goblin", new GameEntity
    {
        Name = "Goblin",
        Stats = new Stats { Health = 30, Attack = 8 }
    });

    var g1 = registry.Create("goblin");
    var g2 = registry.Create("goblin");
    g1.Stats.Health = 0;

    Assert.Equal(30, g2.Stats.Health);  // independent
    Assert.NotEqual(g1.InstanceId, g2.InstanceId);  // unique IDs
}

[Fact]
public void Registry_UnknownKey_Throws()
{
    var registry = new PrototypeRegistry<GameEntity>();
    Assert.Throws<KeyNotFoundException>(() => registry.Create("dragon"));
}
Exercise 4: Thread-Safe Prototype Pool with Concurrent Access Hard

Build a PrototypePool<T> that pre-clones N instances and serves them under high concurrency. When the pool empties, it should refill asynchronously. Write a stress test that spawns 1000 items across 10 threads and verifies all are unique and valid.

  • Use Channel<T> (System.Threading.Channels) for the thread-safe buffer
  • Pre-fill the channel in the constructor
  • If TryRead fails, clone directly from the prototype and trigger a background refill
  • For the stress test, use Parallel.ForEachAsync with MaxDegreeOfParallelism = 10
ThreadSafePool.cs
using System.Threading.Channels;

public sealed class PrototypePool<T> where T : class, IPrototype<T>
{
    private readonly T _prototype;
    private readonly Channel<T> _pool;
    private readonly int _refillBatch;
    private int _refilling; // 0 = idle, 1 = refilling

    public PrototypePool(T prototype, int capacity = 100, int refillBatch = 25)
    {
        _prototype = prototype;
        _refillBatch = refillBatch;
        _pool = Channel.CreateBounded<T>(new BoundedChannelOptions(capacity)
        {
            SingleWriter = false,
            SingleReader = false
            // Default FullMode (Wait): TryWrite returns false when full,
            // so RefillSync can stop instead of silently dropping clones.
        });
        RefillSync(capacity);
    }

    public T Rent()
    {
        if (_pool.Reader.TryRead(out var item))
        {
            TriggerRefillIfNeeded();
            return item;
        }
        return _prototype.DeepClone(); // fallback: clone on demand
    }

    private void TriggerRefillIfNeeded()
    {
        if (_pool.Reader.Count < _refillBatch &&
            Interlocked.CompareExchange(ref _refilling, 1, 0) == 0)
        {
            Task.Run(() =>
            {
                RefillSync(_refillBatch);
                Interlocked.Exchange(ref _refilling, 0);
            });
        }
    }

    private void RefillSync(int count)
    {
        for (int i = 0; i < count; i++)
            if (!_pool.Writer.TryWrite(_prototype.DeepClone())) break;
    }
}

[Fact]
public async Task Pool_ConcurrentAccess_AllUnique()
{
    var prototype = new GameEntity
    {
        Name = "Goblin",
        Stats = new Stats { Health = 30 }
    };
    var pool = new PrototypePool<GameEntity>(prototype, capacity: 200);
    var bag = new ConcurrentBag<Guid>();

    await Parallel.ForEachAsync(
        Enumerable.Range(0, 1000),
        new ParallelOptions { MaxDegreeOfParallelism = 10 },
        async (_, _) =>
        {
            var entity = pool.Rent();
            bag.Add(entity.InstanceId);
            Assert.Equal(30, entity.Stats.Health);
            await Task.CompletedTask;
        });

    Assert.Equal(1000, bag.Count);
    Assert.Equal(1000, bag.Distinct().Count()); // all unique IDs
}
Section 19

Cheat Sheet

Core Interface
public interface IPrototype<out T>
    where T : class
{
    T DeepClone();
}

// Implement on every clonable type
// Returns CONCRETE type, not object
// Never use ICloneable
Deep Clone Checklist
✅ Value types → assign directly
✅ Strings → assign (immutable)
✅ List<T> → new List<T>(source)
✅ Dict → new Dictionary(source)
✅ Nested objects → obj.DeepClone()
❌ Events → do NOT copy
❌ IDs → reset to 0/NewGuid
❌ IDisposable → do NOT clone
Clone Strategy Decision
Value-types only?
  → MemberwiseClone() (~5ns)
Immutable record?
  → with expression (~10ns)
Complex + control needed?
  → Manual DeepClone (~200ns)
Arbitrary graph?
  → JSON/MessagePack (~10μs)
AOT required?
  → Manual or Mapperly
Prototype Registry
Dictionary<string, IPrototype<T>>

Register: registry[key] = proto
Create:   registry[key].DeepClone()
Thread-safe: ConcurrentDictionary
Immutable: ImmutableDictionary

NEVER expose prototype directly
ALWAYS return clones
Common Bugs
1. Shallow copy shares lists
2. ICloneable returns object
3. Circular refs → StackOverflow
4. Events copied → wrong handlers
5. Same ID → duplicate key in DB
6. Thread-unsafe registry

Fix: typed deep clone + tests
that mutate clone & assert
original unchanged
Modern .NET Alternatives
Records + with:
  var b = a with { X = 1 };
  (shallow, type-safe)

ImmutableList.Add():
  Structural sharing
  (efficient "clone")

Mapperly source gen:
  Compile-time mapping
  (AOT, fast, typed)
Section 20

Deep Dive

Three topics that go beyond the basics. Understanding recordsC# 9+ reference types declared with the 'record' keyword. They get compiler-generated Equals, GetHashCode, ToString, and a Clone method exposed through 'with' expressions. Records are immutable by convention (init-only setters) and provide value-based equality — two records with the same property values are considered equal., registriesA Prototype Registry (or Prototype Manager) is a centralized catalog of named prototype instances. It maps string keys to pre-configured objects. Clients request clones by name, and the registry handles the deep copy internally. Common in game engines (enemy catalogs), document systems (template libraries), and plugin architectures., and object poolsA creational pattern that pre-allocates a fixed set of objects and "rents" them to consumers. Unlike Prototype (which creates new copies), pools REUSE existing objects. The consumer must RETURN the object when done. Used for expensive-to-allocate resources: buffers (ArrayPool), database connections (connection pooling), threads (ThreadPool). will make you dangerous — these are the conversations that separate "I know Prototype" from "I've used Prototype in production."

Records and with — The Modern C# Prototype

C# records are Prototype at the language level. Let's look at what the compiler actually generates and where the boundaries are.

RecordDeepDive.cs
// What you write:
public record Person(string Name, int Age, Address Address);

// What the compiler generates (simplified):
public class Person : IEquatable<Person>
{
    public string Name { get; init; }
    public int Age { get; init; }
    public Address Address { get; init; }

    // Copy constructor — the heart of Prototype
    protected Person(Person original)
    {
        this.Name = original.Name;
        this.Age = original.Age;
        this.Address = original.Address;  // ← shallow reference copy!
    }

    // Hidden clone method used by 'with'
    public virtual Person <Clone>$() => new Person(this);

    // 'var b = a with { Name = "Bob" };' compiles to:
    // var temp = a.<Clone>$();
    // temp.Name = "Bob";
    // var b = temp;
}

// The deep copy problem with records:
public record Order(int Id, List<string> Items);

var a = new Order(1, new List<string> { "Widget" });
var b = a with { Id = 2 };
b.Items.Add("Gadget");  // ❌ Also adds to a.Items! Shallow copy!

// Solution: use immutable collections with records
public record SafeOrder(int Id, ImmutableList<string> Items);

var sa = new SafeOrder(1, ImmutableList.Create("Widget"));
var sb = sa with { Items = sa.Items.Add("Gadget") };
// sa.Items has 1 item, sb.Items has 2 — independent via structural sharing
Rule of Thumb: Records + with are safe when ALL properties are value types, strings, other records, or immutable collections. The moment you have a mutable List<T> or a mutable class property, the shallow clone bites you. Design records with immutability in mind and you get Prototype for free.
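One detail worth knowing: when the nested properties are records too, nested with expressions give you deep-ish updates with no shared mutable state. A minimal sketch (Contact and Address here are illustrative types, not from the examples above):

```csharp
var a = new Contact("Ada", new Address("London", "N1"));

// Rebuild only the path that changes; the unchanged Zip is shared,
// which is safe because every type involved is immutable.
var b = a with { Address = a.Address with { City = "Paris" } };

Console.WriteLine(a.Address.City); // London — original untouched
Console.WriteLine(b.Address.City); // Paris

public record Address(string City, string Zip);
public record Contact(string Name, Address Address);
```

Structural sharing does the heavy lifting: you only pay for the objects along the changed path, and the rest is reused safely.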

The Prototype Registry (also called Prototype Manager) is a centralized catalog of pre-configured prototype instances. It's the "warehouse" version of Prototype — instead of each client knowing how to create prototypes, a single registry holds all templates and serves clones on demand.

RegistryDeepDive.cs
// Production-grade prototype registry with validation and lifecycle
public sealed class PrototypeRegistry<T> : IDisposable where T : class, IPrototype<T>
{
    private readonly ConcurrentDictionary<string, T> _store = new(StringComparer.OrdinalIgnoreCase);
    private readonly Func<T, bool>? _validator;

    public PrototypeRegistry(Func<T, bool>? validator = null) => _validator = validator;

    public void Register(string key, T prototype)
    {
        ArgumentException.ThrowIfNullOrWhiteSpace(key);
        ArgumentNullException.ThrowIfNull(prototype);

        if (_validator is not null && !_validator(prototype))
            throw new ArgumentException($"Prototype '{key}' failed validation");

        // Store a CLONE of the input — caller can't mutate our template
        _store[key] = prototype.DeepClone();
    }

    public T Create(string key)
    {
        if (!_store.TryGetValue(key, out var proto))
            throw new KeyNotFoundException($"No prototype: '{key}'. Available: {string.Join(", ", _store.Keys)}");
        return proto.DeepClone();
    }

    public IReadOnlyCollection<string> Keys => _store.Keys.ToList().AsReadOnly();
    public bool Contains(string key) => _store.ContainsKey(key);
    public bool Remove(string key) => _store.TryRemove(key, out _);

    public void Dispose()
    {
        foreach (var proto in _store.Values)
            if (proto is IDisposable d) d.Dispose();
        _store.Clear();
    }
}

// DI Registration — make the registry available application-wide
builder.Services.AddSingleton(sp =>
{
    var registry = new PrototypeRegistry<ReportTemplate>(t => !string.IsNullOrEmpty(t.Title));
    registry.Register("quarterly", LoadQuarterlyTemplate());
    registry.Register("monthly", LoadMonthlyTemplate());
    registry.Register("annual", LoadAnnualTemplate());
    return registry;
});
Key Design Decision

Notice Register() clones the input prototype before storing it. This prevents a subtle bug: if the caller modifies the object they passed in after registration, it would corrupt the stored template. Clone-on-register + clone-on-create = maximum isolation.
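A quick sketch of the exact scenario clone-on-register guards against. The Doc type is illustrative (any IPrototype<T> implementation behaves the same), and the registry is the PrototypeRegistry<T> defined above:

```csharp
var registry = new PrototypeRegistry<Doc>();
var input = new Doc { Sections = { "intro" } };
registry.Register("memo", input);

// Caller keeps a reference and mutates it AFTER registration...
input.Sections.Add("oops");

// ...but Register() stored a clone, so the template is untouched.
var fresh = registry.Create("memo");
Console.WriteLine(fresh.Sections.Count);  // 1 — "oops" never made it in

// Illustrative type for the sketch
public sealed class Doc : IPrototype<Doc>
{
    public List<string> Sections { get; set; } = new();
    public Doc DeepClone() => new() { Sections = new List<string>(Sections) };
}
```

Without the clone in Register(), the "oops" mutation would silently corrupt every future Create("memo") call.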

Object Pool and Prototype both avoid expensive object creation, but they have fundamentally different lifecycles:

| Aspect | Object Pool | Prototype |
|---|---|---|
| Lifecycle | Rent → Use → Return → Reuse | Clone → Use → Discard |
| State | Must be RESET between uses | Starts with TEMPLATE state |
| Ownership | Pool owns the objects | Caller owns the clone |
| Concurrency | One user at a time per instance | Each user gets their own copy |
| Memory | Fixed pool size (bounded) | Unbounded (new objects each time) |
| .NET Example | ArrayPool<T>, ObjectPool<T> | Records with with, DeepClone() |
| Best For | Expensive-to-allocate, stateless objects (buffers, connections) | Expensive-to-configure, stateful objects (templates, configs) |
PoolVsPrototype.cs
// Object Pool: rent a buffer, use it, return it
var pool = ArrayPool<byte>.Shared;
var buffer = pool.Rent(1024);  // might get a REUSED buffer
try { /* use buffer */ }
finally { pool.Return(buffer); }  // MUST return — or memory leak

// Prototype: clone a template, use the clone, let GC collect it
var report = reportRegistry.Create("quarterly");  // NEW clone
report.Department = "Engineering";
await SaveReport(report);
// No need to "return" anything — GC handles it

// Hybrid: pre-cloned pool (clone from prototype, serve from pool)
// Best of both: no expensive construction, no shared state
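A minimal sketch of that hybrid. It's named PreClonedPool here to keep it distinct from the PrototypePool exercised in the tests earlier; clones are made up front from the prototype, served from a bounded channel, and the pool falls back to cloning on demand when empty:

```csharp
using System.Threading.Channels;

public sealed class PreClonedPool<T> where T : class, IPrototype<T>
{
    private readonly Channel<T> _ready;
    private readonly T _prototype;

    public PreClonedPool(T prototype, int capacity)
    {
        _prototype = prototype.DeepClone();           // isolate the template
        _ready = Channel.CreateBounded<T>(capacity);
        for (var i = 0; i < capacity; i++)
            _ready.Writer.TryWrite(_prototype.DeepClone());
    }

    // Serve a pre-made clone if one is ready; otherwise clone on demand.
    // Either way the caller owns the result — nothing is ever "returned".
    public T Take() =>
        _ready.Reader.TryRead(out var item) ? item : _prototype.DeepClone();
}
```

Note this keeps the Prototype ownership model: unlike a true object pool there is no Return(), so there is nothing to reset and no shared state to leak.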
Section 21

Real-World Mini-Project

Let's build a Game Entity Spawner — from a naive first attempt to a battle-tested implementation that handles deep cloning, registries, and concurrency.

Attempt 1: Naive (Broken)

NaiveSpawner.cs
// ATTEMPT 1: A junior's first try at enemy spawning
public class Enemy
{
    public string Type { get; set; } = "";
    public int Health { get; set; }
    public int Attack { get; set; }
    public List<string> Abilities { get; set; } = new();
    public Dictionary<string, int> Resistances { get; set; } = new();

    public Enemy ShallowClone() => (Enemy)MemberwiseClone(); // shallow!
}

public class NaiveSpawner
{
    private readonly Dictionary<string, Enemy> _templates = new();

    public NaiveSpawner()
    {
        _templates["goblin"] = new Enemy
        {
            Type = "Goblin", Health = 30, Attack = 8,
            Abilities = { "slash" }, Resistances = { ["fire"] = -10 }
        };
    }

    public Enemy Spawn(string type)
    {
        var template = _templates[type];
        return template.ShallowClone(); // ❌ shallow copy!
    }
}

// In the game loop:
var spawner = new NaiveSpawner();
var g1 = spawner.Spawn("goblin");
g1.Abilities.Add("dodge");    // ❌ Also adds to template!
g1.Health = 0;                // ✓ This is fine (value type)

var g2 = spawner.Spawn("goblin");
// g2.Abilities = ["slash", "dodge"] ← corrupted by g1!
3 Critical Bugs

1. Shallow clone shares Abilities list and Resistances dictionary — modifying one enemy corrupts all future spawns.
2. No unique instance ID — can't track individual enemies.
3. Not thread-safe — concurrent spawns on a game server corrupt the template.
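Bug #1 is exactly what a clone-independence test catches. A sketch, assuming the NaiveSpawner and Enemy types above — note the assertions document the bug rather than the desired behavior:

```csharp
using Xunit;

public class NaiveSpawnerTests
{
    [Fact]
    public void NaiveSpawn_SharesAbilitiesList()
    {
        var spawner = new NaiveSpawner();

        var g1 = spawner.Spawn("goblin");
        g1.Abilities.Add("dodge");       // mutate the first clone

        var g2 = spawner.Spawn("goblin");

        // Both shallow clones share the TEMPLATE's list instance:
        Assert.Same(g1.Abilities, g2.Abilities);
        Assert.Contains("dodge", g2.Abilities);
    }
}
```

Once DeepClone is in place (Attempt 2), flip these assertions to Assert.NotSame / Assert.DoesNotContain and they become permanent regression guards.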

BetterSpawner.cs
// ATTEMPT 2: Deep clone with typed interface
public class Enemy : IPrototype<Enemy>
{
    public Guid InstanceId { get; private set; } = Guid.NewGuid();
    public string Type { get; set; } = "";
    public int Health { get; set; }
    public int Attack { get; set; }
    public List<string> Abilities { get; set; } = new();
    public Dictionary<string, int> Resistances { get; set; } = new();

    public Enemy DeepClone() => new()
    {
        // InstanceId auto-generates via Guid.NewGuid() default
        Type = this.Type,
        Health = this.Health,
        Attack = this.Attack,
        Abilities = new List<string>(this.Abilities),
        Resistances = new Dictionary<string, int>(this.Resistances)
    };
}

public class BetterSpawner
{
    private readonly Dictionary<string, Enemy> _templates = new();

    public BetterSpawner()
    {
        _templates["goblin"] = new Enemy
        {
            Type = "Goblin", Health = 30, Attack = 8,
            Abilities = { "slash" }, Resistances = { ["fire"] = -10 }
        };
    }

    public Enemy Spawn(string type) => _templates[type].DeepClone();
}

// Now: g1.Abilities.Add("dodge") does NOT affect template or g2
Remaining Issues

1. Not thread-safe — Dictionary isn't safe for concurrent reads during writes.
2. No way to add/update templates at runtime without race conditions.
3. No validation — what if someone registers an enemy with 0 health?

Enemy.cs
public interface IPrototype<out T> where T : class
{
    T DeepClone();
}

public sealed class Enemy : IPrototype<Enemy>
{
    public Guid InstanceId { get; private set; } = Guid.NewGuid();
    public string Type { get; init; } = "";
    public int Health { get; set; }
    public int MaxHealth { get; init; }
    public int Attack { get; init; }
    public double Speed { get; init; }
    public List<string> Abilities { get; init; } = new();
    public Dictionary<string, int> Resistances { get; init; } = new();

    public Enemy DeepClone() => new()
    {
        Type = Type,
        Health = MaxHealth,          // clones spawn at full health
        MaxHealth = MaxHealth,
        Attack = Attack,
        Speed = Speed,
        Abilities = new List<string>(Abilities),
        Resistances = new Dictionary<string, int>(Resistances)
    };

    public override string ToString() =>
        $"{Type} [{InstanceId.ToString()[..8]}] HP:{Health}/{MaxHealth}";
}
EnemySpawner.cs
public sealed class EnemySpawner
{
    private readonly ConcurrentDictionary<string, Enemy> _prototypes = new(StringComparer.OrdinalIgnoreCase);

    public void RegisterPrototype(string key, Enemy prototype)
    {
        ArgumentException.ThrowIfNullOrWhiteSpace(key);
        if (prototype.MaxHealth <= 0) throw new ArgumentException("MaxHealth must be positive");
        _prototypes[key] = prototype.DeepClone(); // clone-on-register for safety
    }

    public Enemy Spawn(string type)
    {
        if (!_prototypes.TryGetValue(type, out var proto))
            throw new ArgumentException($"Unknown enemy: '{type}'. Available: {string.Join(", ", _prototypes.Keys)}");
        return proto.DeepClone();
    }

    public List<Enemy> SpawnWave(string type, int count) =>
        Enumerable.Range(0, count).Select(_ => Spawn(type)).ToList();

    public IReadOnlyCollection<string> AvailableTypes => _prototypes.Keys.ToList().AsReadOnly();
}
Program.cs
builder.Services.AddSingleton(sp =>
{
    var spawner = new EnemySpawner();

    spawner.RegisterPrototype("goblin", new Enemy
    {
        Type = "Goblin", MaxHealth = 30, Health = 30, Attack = 8, Speed = 1.2,
        Abilities = { "slash", "dodge" }, Resistances = { ["fire"] = -10 }
    });

    spawner.RegisterPrototype("orc", new Enemy
    {
        Type = "Orc", MaxHealth = 80, Health = 80, Attack = 15, Speed = 0.8,
        Abilities = { "slam", "roar", "enrage" },
        Resistances = { ["fire"] = 5, ["ice"] = -5 }
    });

    spawner.RegisterPrototype("dragon", new Enemy
    {
        Type = "Dragon", MaxHealth = 500, Health = 500, Attack = 40, Speed = 1.5,
        Abilities = { "flame-breath", "tail-swipe", "fly", "intimidate" },
        Resistances = { ["fire"] = 100, ["ice"] = -20, ["physical"] = 15 }
    });

    return spawner;
});
EnemySpawnerTests.cs
[Fact]
public void Spawn_ReturnsIndependentClone()
{
    var spawner = CreateSpawner();
    var g1 = spawner.Spawn("goblin");
    var g2 = spawner.Spawn("goblin");

    g1.Health = 0;
    g1.Abilities.Add("flee");

    Assert.Equal(30, g2.Health);                     // independent HP
    Assert.DoesNotContain("flee", g2.Abilities);     // independent abilities
    Assert.NotEqual(g1.InstanceId, g2.InstanceId);   // unique IDs
}

[Fact]
public void SpawnWave_AllUnique()
{
    var spawner = CreateSpawner();
    var wave = spawner.SpawnWave("orc", 50);

    Assert.Equal(50, wave.Count);
    Assert.Equal(50, wave.Select(e => e.InstanceId).Distinct().Count());
    Assert.All(wave, e => Assert.Equal(80, e.Health));
}

[Fact]
public async Task ConcurrentSpawns_AreThreadSafe()
{
    var spawner = CreateSpawner();
    var bag = new ConcurrentBag<Enemy>();

    await Parallel.ForEachAsync(
        Enumerable.Range(0, 1000),
        new ParallelOptions { MaxDegreeOfParallelism = 10 },
        async (_, _) =>
        {
            bag.Add(spawner.Spawn("goblin"));
            await Task.CompletedTask;
        });

    Assert.Equal(1000, bag.Count);
    Assert.Equal(1000, bag.Select(e => e.InstanceId).Distinct().Count());
}

private static EnemySpawner CreateSpawner()
{
    var spawner = new EnemySpawner();
    spawner.RegisterPrototype("goblin", new Enemy
    {
        Type = "Goblin", MaxHealth = 30, Health = 30, Attack = 8, Speed = 1.2,
        Abilities = { "slash" }, Resistances = { ["fire"] = -10 }
    });
    spawner.RegisterPrototype("orc", new Enemy
    {
        Type = "Orc", MaxHealth = 80, Health = 80, Attack = 15, Speed = 0.8,
        Abilities = { "slam", "roar" }, Resistances = { }
    });
    return spawner;
}
Production Ready

Thread-safe via ConcurrentDictionary. Clone-on-register prevents template mutation. Validation catches invalid prototypes at registration. Unique InstanceId per spawn. Full deep independence — verified by concurrent stress tests.

Section 22

Migration Guide

Step 1: Identify Expensive Object Construction

Step1_Identify.cs
// BEFORE: Every call to CreateReport builds from scratch
public Report CreateReport(string department)
{
    var report = new Report();
    report.LoadTemplates();          // 100ms — reads files from disk
    report.ApplyFormattingRules();   // 50ms — parses rule engine
    report.RunComplianceChecks();    // 50ms — calls external service
    report.Department = department;
    return report;
}
// Called 20 times per day → 20 × 200ms = 4 seconds wasted on identical setup

// STEP 1: Profile and identify the expensive parts
// Template loading, formatting, and compliance are IDENTICAL across departments
// Only the Department assignment varies per call

Risk: Zero. This step is analysis only — no code changes.

Step2_AddClone.cs
// STEP 2: Add the Prototype interface
public interface IPrototype<out T> where T : class { T DeepClone(); }

public class Report : IPrototype<Report>
{
    public string Department { get; set; } = "";
    public List<string> Sections { get; set; } = new();
    public ComplianceData Compliance { get; set; } = new();
    // ... existing properties

    // NEW: deep clone method
    public Report DeepClone() => new()
    {
        Department = this.Department,
        Sections = new List<string>(this.Sections),
        Compliance = this.Compliance.DeepClone()
    };
}

Risk: Minimal. Adding an interface and method doesn't break existing code. Write tests for DeepClone before proceeding.
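A sketch of the clone-independence test to write at this step, before the creation path changes (assumes the Report type above; ComplianceData is presumed to have its own DeepClone as shown):

```csharp
using Xunit;

public class ReportCloneTests
{
    [Fact]
    public void DeepClone_IsFullyIndependent()
    {
        var original = new Report { Department = "HR", Sections = { "Summary" } };

        var clone = original.DeepClone();
        clone.Department = "Engineering";
        clone.Sections.Add("Appendix");

        // Mutating the clone must not touch the original
        Assert.Equal("HR", original.Department);
        Assert.Single(original.Sections);
        Assert.NotSame(original.Sections, clone.Sections);
        Assert.NotSame(original.Compliance, clone.Compliance);
    }
}
```

The mutate-clone-then-assert-original shape is the one test pattern that reliably catches shallow-copy regressions; add one per mutable reference-type property.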

Step3_Replace.cs
// AFTER: Build the expensive template ONCE, clone for each department
public class ReportFactory
{
    private readonly Report _template;

    public ReportFactory()
    {
        // Expensive setup happens once in constructor
        _template = new Report();
        _template.LoadTemplates();          // 100ms — once
        _template.ApplyFormattingRules();   // 50ms — once
        _template.RunComplianceChecks();    // 50ms — once
    }

    public Report CreateReport(string department)
    {
        var report = _template.DeepClone();  // ~1μs instead of 200ms
        report.Department = department;
        return report;
    }
}

// Register in DI
builder.Services.AddSingleton<ReportFactory>();

// Usage — caller doesn't know about cloning
var report = reportFactory.CreateReport("Engineering");

Risk: Medium. This changes the creation path. Run existing tests, add clone-independence tests, and verify that the cloned reports produce identical output to the old from-scratch reports.

Section 23

Code Review Checklist

Print it, bookmark it, tattoo it on your forearm. Run through this checklist every time you review Prototype-related code:

| # | Check | Why It Matters | Red Flag |
|---|---|---|---|
| 1 | All mutable reference-type fields are deep-copied | Shallow copy of lists/dicts creates shared-state bugs | Sections = this.Sections (missing new List<>()) |
| 2 | Clone method returns concrete type, not object | Avoids unsafe runtime casts | public object Clone() or ICloneable |
| 3 | Identity fields (Id, Guid) are reset in clone | Prevents duplicate key violations on save | Id = this.Id in DeepClone |
| 4 | Events/delegates are NOT copied | Cloned handlers fire on wrong subscribers | MemberwiseClone() on class with events |
| 5 | IDisposable fields are NOT cloned | Double-dispose corrupts resources | _stream = this._stream in clone |
| 6 | Circular references use visited dictionary | Infinite recursion → StackOverflow | Recursive DeepClone without cycle detection |
| 7 | Prototype registry is thread-safe | Concurrent reads/writes corrupt state | Dictionary instead of ConcurrentDictionary |
| 8 | Registry never exposes prototype directly | Callers can accidentally mutate template | GetPrototype() returning the stored instance |
| 9 | Clone independence is tested | Only way to catch shallow-copy bugs | No test that mutates clone and checks original |
| 10 | Performance justifies the pattern | Prototype adds complexity — must earn its keep | Cloning objects that are trivial to construct |
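Check #6 deserves a sketch. A visited map makes circular graphs terminate: register the copy before descending, and any back-reference resolves to the already-made clone instead of recursing forever. TreeNode is illustrative; since it doesn't override Equals, the default Dictionary comparison is reference equality, which is exactly what cycle detection needs:

```csharp
public sealed class TreeNode
{
    public string Name { get; set; } = "";
    public TreeNode? Parent { get; set; }
    public List<TreeNode> Children { get; } = new();

    public TreeNode DeepClone() => DeepClone(new Dictionary<TreeNode, TreeNode>());

    private TreeNode DeepClone(Dictionary<TreeNode, TreeNode> visited)
    {
        // Already cloned this node? Reuse the copy instead of recursing.
        if (visited.TryGetValue(this, out var existing))
            return existing;

        var copy = new TreeNode { Name = Name };
        visited[this] = copy;                      // register BEFORE descending
        copy.Parent = Parent?.DeepClone(visited);  // cycle-safe: hits the map
        foreach (var child in Children)
            copy.Children.Add(child.DeepClone(visited));
        return copy;
    }
}
```

Without the visited map, the Parent → Children → Parent cycle alone is enough to blow the stack.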
Automate it: Enable these Roslyn analyzers to catch Prototype issues at compile time:
  • CA1062 — Validate arguments of public methods (catch null prototypes)
  • CA2000 — Dispose objects before losing scope (catch cloned IDisposable)
  • CA1024 — Use properties where appropriate (flag GetClone() methods)
  • Meziantou.Analyzer — Additional rules for collection copying and equality checks