C# threads, thread pool, and Task: concepts and code practice

Reposted from: JerryMouseLi

cnblogs.com/JerryMouseLi/p/14135600.html

Preface

Threading involves many concepts that stay abstract without code to ground them, and some of them are hard to pin down from definitions alone. This article therefore walks through small runnable samples to explain the basic concepts of threads, so that both the author and the reader can understand them more deeply.

1. Thread safety

1.1 No thread preemption

class ThreadTest2
{
    bool Done;
    static void Main()
    {
        ThreadTest2 tt = new ThreadTest2(); // Create a common instance
        new Thread(tt.Go).Start();
        tt.Go();
    }
    // Note that Go is now an instance method
    void Go()
    {
        if (!Done)
        {
            Done = true;
            Console.WriteLine("Done");
        }
    }
}

The result of the operation is as follows:

Done

1.2 Thread preemption

class ThreadTest2
{
    bool Done;
    static void Main()
    {
        ThreadTest2 tt = new ThreadTest2(); // Create a common instance
        new Thread(tt.Go).Start();
        tt.Go();
    }
    // Note that Go is now an instance method
    void Go()
    {
        if (!Done)
        {
            Console.WriteLine("Done");
            Done = true;
        }
    }
}

The result of the operation is as follows:

Done
Done

Thread preemption example 2:

for (int i = 0; i < 10; i++)
  new Thread(() => Console.Write(i)).Start();

Output:

0223557799

1.3 Avoiding thread preemption

class ThreadTest2
{
    static readonly object locker = new object();
    bool Done;
    static void Main()
    {
        ThreadTest2 tt = new ThreadTest2(); // Create a common instance
        new Thread(tt.Go).Start();
        tt.Go();
    }
    // Note that Go is now an instance method
    void Go()
    {
        lock (locker)
        {
            if (!Done)
            {
                Console.WriteLine("Done");
                Done = true;
            }
        }
    }
}

The result of the operation is as follows:

Done

2. Thread blocking

class Program
{
    static void Main()
    {
        Thread t = new Thread(Go);
        t.Start();
        t.Join();
        Console.WriteLine("Thread t has ended!");
    }
    static void Go()
    {
        for (int i = 0; i < 1000; i++) Console.Write("y");
    }
}

Output:


"Thread t has ended!" is printed only after all 1000 "y" characters have been written, because Join blocks the main thread until thread t finishes.

Thread.Sleep(500);

It also blocks the calling thread, handing the CPU over to other threads for the duration of the sleep.

3. Thread.Yield() and Thread.Sleep(0)

Sleep(0) has essentially the same effect as Yield(): the current thread relinquishes the remainder of its time slice and moves to the back of the queue of ready threads of the same priority, getting another chance to run only after the equal-priority threads ahead of it have been scheduled.
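As a rough illustration (the loop count and messages below are my own, not from the original article), each call hands the remainder of the current time slice to other ready threads of the same priority, so the loop does not hog the CPU between iterations:

using System;
using System.Threading;

class YieldSketch
{
    static void Main()
    {
        for (int i = 0; i < 5; i++)
        {
            Console.WriteLine("iteration {0} on thread {1}", i, Thread.CurrentThread.ManagedThreadId);
            Thread.Sleep(0);   // give up the rest of the time slice to another ready thread, if any
            Thread.Yield();    // similar effect; returns false if no other thread was ready to run
        }
    }
}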

4. How threads work

Multithreading is managed internally by a thread scheduler, a function the CLR typically delegates to the operating system. The thread scheduler allocates execution time to the active threads; threads that are waiting (for example, on a lock) or blocked (for example, on user input) consume no CPU time.

On a single-core computer running Windows, time slices are typically in the tens of milliseconds, far larger than the CPU overhead of switching context from one thread to another, which takes only a few microseconds.

On multiprocessor computers, multithreading is implemented with a mixture of time slicing and genuine concurrency, where different threads run code simultaneously on different CPUs. Some time slicing is still almost certain, because the operating system must service its own threads as well as those of other applications.

A thread is said to be preempted when its execution is interrupted by an external factor such as time slicing. In most situations a thread has no control over when and where it is preempted.

5. Threads and processes

Threads are similar to processes: just as processes run in parallel on a computer, threads run in parallel within a single process. Processes are fully isolated from each other, whereas threads are only partially isolated. In particular, threads share heap memory with the other threads running in the same application. This is partly why threads are useful: one thread can fetch data in the background, for example, while another thread displays the data as it arrives.
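To make that sharing concrete, here is a minimal sketch (class and variable names are my own illustration): the heap-allocated list is visible to both threads, while each thread keeps its own stack of local variables.

using System;
using System.Collections.Generic;
using System.Threading;

class SharedHeapSketch
{
    static readonly List<int> shared = new List<int>(); // lives on the heap, visible to every thread
    static readonly object locker = new object();

    static void Main()
    {
        Thread producer = new Thread(() =>
        {
            for (int i = 0; i < 5; i++)
                lock (locker) shared.Add(i);             // background thread fills the shared list
        });
        producer.Start();
        producer.Join();                                 // wait for the producer, then read its data
        lock (locker) Console.WriteLine(string.Join(",", shared)); // 0,1,2,3,4
    }
}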

6. The use and abuse of threads

  • Facilitates a responsive user interface

Running time-consuming tasks on parallel "worker" threads leaves the main UI thread free to keep processing keyboard and mouse events.

  • Making efficient use of an otherwise blocked CPU

Multithreading is useful when a thread is waiting for a response from another computer or a piece of hardware. While one thread is blocked carrying out that task, other threads can take advantage of the otherwise idle CPU.

  • Parallel programming

Code that performs intensive computations can execute faster on multi-core or multi-processor computers if the workload is shared among multiple threads in a “divide and conquer” strategy (see Part 5).

  • Speculative execution

On multi-core machines, performance can sometimes be improved by anticipating work that might need to be done and doing it ahead of time. LINQPad uses this technique to speed up the creation of new queries. A variation is to run several different algorithms in parallel that all solve the same task: whichever finishes first "wins", which works well when you cannot know in advance which algorithm will be fastest.

  • Allows the service to process requests concurrently

On a server, client requests can arrive concurrently and so must be processed in parallel (the .NET Framework creates threads for this automatically if you use ASP.NET, WCF, Web Services, or Remoting). This can also be useful on the client side, for example when handling peer-to-peer networking, or even multiple requests from the user.

With technologies such as ASP.NET and WCF, you may be unaware that multithreading is even taking place, unless you access shared data (perhaps via static fields) without appropriate locking and thereby break thread safety.

Interaction between threads (typically via shared data) adds significant complexity. It cannot always be avoided, so keep such interaction to a minimum and stick to simple, proven designs wherever possible.

A good strategy is to encapsulate multithreading logic into reusable classes that can be inspected and tested independently. The framework itself provides a number of higher-level threading constructs, which we’ll cover later.

Threading also incurs resource and CPU costs in scheduling and switching threads (when there are more active threads than CPU cores), as well as creation and teardown costs. Multithreading will not always speed up your application; it can even slow it down if used excessively or inappropriately. For example, when heavy disk I/O is involved, a couple of worker threads running tasks sequentially can finish far sooner than ten threads running at once.

7. Thread parameters

7.1 Passing parameters with a lambda expression

The most convenient way to pass data to a thread is with a lambda expression: the anonymous method calls the target method with the desired arguments.

static void Main()
{
    Thread t = new Thread(() => Print("Hello from t!"));
    t.Start();
}
static void Print(string message)
{
    Console.WriteLine(message);
}

7.2 Passing a parameter via Thread.Start

static void Main()
{
    Thread t = new Thread(Print);
    t.Start("Hello from t!");
}
static void Print(object messageObj)
{
    string message = (string) messageObj;
    // We need to cast here
    Console.WriteLine(message);
}

7.3 Thread creation takes time

string text = "t1";
Thread t1 = new Thread(() => Console.WriteLine(text));

text = "t2";
Thread t2 = new Thread(() => Console.WriteLine(text));

t1.Start();
t2.Start();

Output:

t2
t2

Both threads print t2: each lambda captures the variable text itself rather than its value at the time the Thread object was constructed, and text was reassigned to "t2" before thread t1 actually started running.
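A common way around this, sketched below along the lines of the loop example in section 1.2 (the ReadLine is added purely to keep the console open), is to copy the captured value into a variable that is private to each iteration, so later changes to the outer variable no longer affect the delegate:

using System;
using System.Threading;

class CaptureFixSketch
{
    static void Main()
    {
        for (int i = 0; i < 10; i++)
        {
            int temp = i;                                  // fresh variable per iteration
            new Thread(() => Console.Write(temp)).Start(); // each digit now prints exactly once
        }
        Console.ReadLine(); // the order of the digits is still not guaranteed
    }
}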

8. Thread naming

Each thread has a Name property, which makes debugging easier.

static void Main()
{
    Thread.CurrentThread.Name = "main";
    Thread worker = new Thread(Go);
    worker.Name = "worker";
    worker.Start();
    Go();
}
static void Go()
{
    Console.WriteLine("Hello from " + Thread.CurrentThread.Name);
}

Output:

Hello from main
Hello from worker

9. Foreground and background threads

Thread worker = new Thread(() => Console.ReadLine());
if (args.Length > 0) worker.IsBackground = true;
worker.Name = "backThread";
worker.Start();
Console.WriteLine("finish!");

Foreground threads keep the application alive: the process does not end until every foreground thread has finished (here, until ReadLine returns). Background threads make no such claim — once all foreground threads have ended, any remaining background threads are terminated abruptly and the process exits.
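A minimal sketch of that difference (names are illustrative): with IsBackground set, the process exits as soon as Main returns and the worker is killed mid-ReadLine; remove that line and the foreground worker keeps the process alive until ReadLine completes.

using System;
using System.Threading;

class BackgroundSketch
{
    static void Main()
    {
        Thread worker = new Thread(() => Console.ReadLine());
        worker.IsBackground = true;   // remove this line to make the worker a foreground thread
        worker.Start();
        Console.WriteLine("finish!"); // Main ends here; a background worker is then terminated abruptly
    }
}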

10. Thread priority

A thread's priority determines how much execution time it gets relative to the other active threads in the same process, on the following scale:

enum ThreadPriority { Lowest, BelowNormal, Normal, AboveNormal, Highest }
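A short sketch of setting it (the work itself is a placeholder); note that a thread's priority only affects scheduling relative to other threads in the same process:

using System;
using System.Threading;

class PrioritySketch
{
    static void Main()
    {
        Thread worker = new Thread(() => Console.WriteLine("time-critical work"));
        worker.Priority = ThreadPriority.Highest; // favored over Normal-priority threads in this process
        worker.Start();
        worker.Join();
    }
}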

Raising a thread's priority is sometimes still not enough for real-time requirements; in that case you also need to raise the priority of the process itself, using the Process class in the System.Diagnostics namespace:

using (Process p = Process.GetCurrentProcess())
  p.PriorityClass = ProcessPriorityClass.High;

ProcessPriorityClass.High is actually one notch short of the highest priority, Realtime. Setting a process's priority to Realtime tells the OS that you never want the process to yield CPU time to another process. If your program enters an accidental infinite loop, you may find even the operating system locked out, with nothing short of the power button left to rescue you! For this reason, High is usually the better choice for real-time applications.

If your real-time application has a user interface, raising the process priority gives screen updates excessive CPU time, slowing down the entire computer (especially if the UI is complex). Lowering the main thread's priority while raising the process's priority ensures that the real-time worker thread is not preempted by screen redraws, but it does not stop other applications from being starved of CPU time, because the operating system still allocates disproportionate resources to the process as a whole. The ideal solution is to run the real-time worker and the user interface as separate applications with different process priorities, communicating via Remoting or memory-mapped files; memory-mapped files are well suited to this task, and Chapters 14 and 25 of C# 4.0 briefly describe how they work.

11. Exception handling

In the code below, the try/catch in Main cannot catch the exception thrown by Go, because the exception is raised on the newly started thread: once a thread is running, an exception can only be caught on that thread itself, as GoCatch does (printing "exception."). If the caller needs to know about the failure, the information has to be handed back through some shared channel. Task does this for us: the caller can observe exceptions from other threads when it calls task.Wait() or reads task.Result.

public static void Main()
{
    try
    {
        new Thread(Go).Start();
        Console.ReadKey();
    }
    catch (Exception ex)
    {
        // We'll never get here!
        Console.WriteLine("Exception!");
    }
}
static void Go() { throw null; } // Throws a NullReferenceException
static void GoCatch()
{
    try
    {
        //...
        throw null; // The NullReferenceException will get caught below
                      //...
    }
    catch (Exception ex)
    {
        // Typically log the exception, and/or signal another thread
        // that we've come unstuck
        //...
        Console.WriteLine("exception.");
    }
}

12. Thread pool

Starting a thread from scratch is not free: a few hundred microseconds go into housekeeping such as setting up a fresh private local-variable stack, and each thread also consumes about 1 MB of memory by default. The thread pool cuts these overheads by sharing and recycling threads, allowing multithreading to be applied at a very fine-grained level without a performance penalty. This is useful when exploiting multi-core processors to run computationally intensive code in parallel, "divide and conquer" style.

The thread pool also caps the total number of worker threads it will run simultaneously; too many active threads burden the operating system with management overhead and render CPU caches ineffective. Once the limit is reached, jobs are queued and start only when another finishes. This makes arbitrarily concurrent applications, such as a web server, possible. (The asynchronous method pattern is an advanced technique that takes this further by making highly efficient use of pooled threads; it is covered briefly in Chapter 23 of C# 4.0.)

There are several ways to enter the thread pool:

  • Via the Task Parallel Library (from Framework 4.0)

  • By calling ThreadPool.QueueUserWorkItem

  • By calling an asynchronous delegate (BeginInvoke/EndInvoke)

  • via BackgroundWorker

The following methods use the thread pool indirectly:

  • WCF, Remoting, ASP.NET and ASMX Web Services Application Servers

  • System.Timers.Timer and System.Threading.Timer

  • Framework methods ending in Async, such as framework methods on WebClient (event-based asynchronous pattern) and most BeginXXX methods (asynchronous programming model pattern)

  • PLINQ

When using pool threads, you need to pay attention to the following points:

  • There is no way to set the Name of a pooled thread, which makes debugging harder (although you can attach a description in Visual Studio's Threads window while debugging).

  • Pooled threads are always background threads (this is usually not a problem).

  • Blocking early in an application's life can cause extra latency unless you call ThreadPool.SetMinThreads (see section 12.3, thread pool optimization).

You are free to change the priority of a pooled thread; it is restored to normal when the thread is released back into the pool.

You can find out whether you are currently executing on a pooled thread via the Thread.CurrentThread.IsThreadPoolThread property.

12.1 Entering the thread pool through the TPL

The Task classes in the Task Parallel Library, introduced in Framework 4.0, make it easy to use the thread pool. If you are familiar with the older constructs, think of the nongeneric Task class as a replacement for ThreadPool.QueueUserWorkItem, and the generic Task<TResult> as a replacement for asynchronous delegates. The newer constructs are faster, more convenient, and more flexible than the old ones.

To use the Task class, call Task.Factory.StartNew, passing in a delegate of the target method:

static void Main() // The Task class is in System.Threading.Tasks
{
    var task = Task.Factory.StartNew(Go);
    Console.WriteLine("main");
    task.Wait();
    Console.WriteLine(task.Result);
    Console.ReadLine();
}
static string Go()
{
    if (Thread.CurrentThread.IsThreadPoolThread)
    { Console.WriteLine("Hello from the thread pool!"); }
    else { Console.WriteLine("Hello just from the thread!"); }
    return "task complete!";
}

Output result:

main
Hello from the thread pool!
task complete!

12.1.1 Catching Task exceptions

static void Main() // The Task class is in System.Threading.Tasks
{
    var task = Task.Factory.StartNew(Go);
    Console.WriteLine("main");
    try
    { task.Wait(); }
    catch (Exception e)
    {
        Console.WriteLine("exception!");
    }
    Console.WriteLine(task.Result);
    Console.ReadLine();
}
static string Go()
{
    if (Thread.CurrentThread.IsThreadPoolThread)
    { Console.WriteLine("Hello from the thread pool!"); }
    else { Console.WriteLine("Hello just from the thread!"); }
    throw null;
    return "task complete!";
}

When this runs, the exception thrown on the pool thread is caught on the main thread and "exception!" is printed. The next example shows how the generic Task<string> also lets the caller get a return value back from a pool thread:


static void Main()
{
  // Start the task executing:
  Task<string> task = Task.Factory.StartNew<string>
    ( () => DownloadString ("http://www.linqpad.net") );
  // We can do other work here and it will execute in parallel:
  RunSomeOtherMethod();
  // When we need the task's return value, we query its Result property:
  // If it's still executing, the current thread will now block (wait)
  // until the task finishes:
  string result = task.Result;
}
static string DownloadString (string uri)
{
  using (var wc = new System.Net.WebClient())
    return wc.DownloadString(uri);
}

Here Task<string> behaves like an asynchronous delegate whose return value is a string.

12.2 Entering the thread pool without the TPL

If you are targeting a framework earlier than .NET 4.0, you cannot use the Task Parallel Library, so you enter the thread pool by other means.

12.2.1 QueueUserWorkItem

static void Main()
{
    ThreadPool.QueueUserWorkItem(Go);
    ThreadPool.QueueUserWorkItem(Go, 123);
    Console.ReadLine();
}
static void Go(object data) // data will be null with the first call.
{
    Console.WriteLine("Hello from the thread pool! " + data);
}

Output:

Hello from the thread pool!
Hello from the thread pool! 123

Compared with Task, QueueUserWorkItem has two drawbacks:

  • There is no way to get a return value back from the work item;

  • An unhandled exception cannot be marshaled back to the caller.

12.2.2 Asynchronous delegates

An asynchronous delegate's EndInvoke does three things (a minimal sketch follows the list):

  • It waits (blocks) until the delegate has finished executing;

  • It returns the result;

  • It rethrows any unhandled exception back to the caller.
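The article itself gives no code for this pattern, so here is a minimal sketch under the assumption of a .NET Framework target (delegate BeginInvoke/EndInvoke is not supported on .NET Core / .NET 5+); the method and message are illustrative:

using System;

class AsyncDelegateSketch
{
    static int CountCharacters(string text)
    {
        return text.Length;
    }

    static void Main()
    {
        Func<string, int> work = CountCharacters;
        // Start the work on a pool thread:
        IAsyncResult cookie = work.BeginInvoke("hello from a pool thread", null, null);
        // ... the caller is free to do other work here, in parallel ...
        // EndInvoke blocks until the delegate finishes, returns its result,
        // and rethrows any unhandled exception on the caller's thread:
        int length = work.EndInvoke(cookie);
        Console.WriteLine(length);
    }
}

Unlike QueueUserWorkItem, the caller gets both the return value and any exception back via EndInvoke.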

12.3 Thread pool optimization

The thread pool starts out with a single thread. As tasks are assigned, the pool manager "injects" new threads to cope with the extra concurrent workload, up to a maximum limit. After a sufficiently long period of inactivity, the pool manager may "retire" threads if it suspects that doing so will lead to better throughput.

The upper limit on the number of threads the pool will create can be set by calling ThreadPool.SetMaxThreads; the defaults are:

  • 1023 in Framework 4.0 in a 32-bit environment

  • 32768 in Framework 4.0 in a 64-bit environment

  • 250 per core in Framework 3.5

  • 25 per core in Framework 2.0

You can also set a lower limit by calling ThreadPool.SetMinThreads. The lower limit's effect is subtle: it is an advanced optimization that tells the pool manager not to delay the allocation of threads until that number has been reached. Raising the minimum thread count improves concurrency when there are blocked threads.

The default lower limit is one thread per processor core, the minimum that allows full CPU utilization. On server environments such as ASP.NET under IIS, however, the lower limit is typically much higher: as many as 50 or more.

Setting the minimum number of threads in the thread pool:

ThreadPool.SetMinThreads(50, 50);
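For completeness, here is a small sketch of inspecting and adjusting both limits (the printed values vary by framework version and machine; 200 is an arbitrary illustrative cap):

using System;
using System.Threading;

class PoolLimitsSketch
{
    static void Main()
    {
        int workerMax, ioMax, workerMin, ioMin;
        ThreadPool.GetMaxThreads(out workerMax, out ioMax);
        ThreadPool.GetMinThreads(out workerMin, out ioMin);
        Console.WriteLine("max: {0} worker / {1} I/O; min: {2} worker / {3} I/O",
                          workerMax, ioMax, workerMin, ioMin);

        // Both setters return false if the requested values are rejected
        // (for example, a maximum below the current minimum).
        bool okMin = ThreadPool.SetMinThreads(50, 50);
        bool okMax = ThreadPool.SetMaxThreads(200, 200);
        Console.WriteLine("SetMinThreads: {0}, SetMaxThreads: {1}", okMin, okMax);
    }
}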

Code for this article: https://github.com/JerryMouseLi/Thread.git
