miguelmpn February 2016

C# Buffer explanation

This might be a real beginner's question, but I've been reading about this and I'm finding it hard to understand.

This is a sample from the MSDN page on this subject (just slightly smaller).

using System;

class SetByteDemo
{
    // Display the array contents in hexadecimal.
    public static void DisplayArray(Array arr, string name)
    {
        // Get the array element width; format the formatting string.
        int elemWidth = Buffer.ByteLength(arr) / arr.Length;
        string format = String.Format(" {{0:X{0}}}", 2 * elemWidth);

        // Display the array elements from right to left.
        Console.Write("{0,7}:", name);
        for (int loopX = arr.Length - 1; loopX >= 0; loopX--)
            Console.Write(format, arr.GetValue(loopX));
        Console.WriteLine();
    }

    public static void Main()
    {
        // These are the arrays to be modified with SetByte.
        short[] shorts = new short[2];

        Console.WriteLine("Initial values of arrays:\n");

        // Display the initial values of the arrays.
        DisplayArray(shorts, "shorts");

        // Set individual bytes within the array.
        Console.WriteLine("\n" +
            "  Array values after setting byte 1 = 1 and byte 3 = 10\n");
        Buffer.SetByte(shorts, 1, 1);
        Buffer.SetByte(shorts, 3, 10);

        // Display the arrays again.
        DisplayArray(shorts, "shorts");
        Console.ReadKey();
    }
}

SetByte should be easy to understand, but if I print the shorts array before doing the SetByte operations, it looks like this:

{short[2]}
    [0]: 0
    [1]: 0

After doing the first Buffer.SetByte(shorts, 1, 1); the array becomes

{short[2]}
    [0]: 256
    [1]: 0

I don't understand where the 256 comes from.

Answers


taffer February 2016

On the platforms .NET typically runs on, numeric types are stored little-endian. That means that the first byte (0th, actually) of a short, int, etc. contains the least significant bits.

After both SetByte calls, viewed as a byte[], the array contains:

0, 1, 0, 10

As short[] it is interpreted like this:

0 + 1*256 = 256, 0 + 10*256 = 2560
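That arithmetic can be checked directly with BitConverter, which reads bytes back using the machine's native byte order. A minimal sketch (assuming a little-endian platform, which BitConverter.IsLittleEndian reports on):

```csharp
using System;

class LittleEndianDemo
{
    static void Main()
    {
        // The byte sequence from the answer above: 0, 1, 0, 10.
        byte[] bytes = { 0, 1, 0, 10 };

        // On a little-endian platform the low byte comes first,
        // so bytes[0..1] -> 0 + 1*256 and bytes[2..3] -> 0 + 10*256.
        short first = BitConverter.ToInt16(bytes, 0);
        short second = BitConverter.ToInt16(bytes, 2);

        Console.WriteLine(BitConverter.IsLittleEndian); // True on common hardware
        Console.WriteLine(first);   // 256 on little-endian machines
        Console.WriteLine(second);  // 2560 on little-endian machines
    }
}
```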


Gusman February 2016

The Buffer class allows you to manipulate memory as if you were using a void pointer in C; it's like a combination of memcpy, memset, and so on, for manipulating memory quickly in .NET.

When you pass the "shorts" array, the Buffer class "sees" it as a pointer to four consecutive bytes (two shorts, each of them two bytes):

  |[0][1]|[2][3]|
   short  short

So the freshly allocated (zero-filled) array looks like this:

 |[0][0]|[0][0]|
  short  short

When you do Buffer.SetByte(shorts, 1, 1); you instruct the Buffer class to change the second byte (index 1) of the byte view, so it becomes:

|[0][1]|[0][0]|
 short  short

If you convert the two bytes (0x00, 0x01) to a short, you get 0x0100, or 256. Note that the two bytes appear one after the other but in reverse order; that's because the machine stores values in little-endian order (least significant byte first).

The second line, Buffer.SetByte(shorts, 3, 10);, basically does the same: it changes the fourth byte (index 3) to 10:

|[0][1]|[0][10]|
 short  short

And then 0x00,0x0A as a short is 0x0A00 or 2560.
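You can see both views of the same memory by reading the raw bytes back with Buffer.GetByte and then printing the shorts through the normal indexer. A small sketch (values assume a little-endian platform):

```csharp
using System;

class InspectBytes
{
    static void Main()
    {
        short[] shorts = new short[2];
        Buffer.SetByte(shorts, 1, 1);
        Buffer.SetByte(shorts, 3, 10);

        // Byte view: walk the array byte by byte, as Buffer sees it.
        for (int i = 0; i < Buffer.ByteLength(shorts); i++)
            Console.Write("{0} ", Buffer.GetByte(shorts, i));
        Console.WriteLine();          // 0 1 0 10

        // Short view: the same memory through the indexer.
        Console.WriteLine(shorts[0]); // 256 on little-endian platforms
        Console.WriteLine(shorts[1]); // 2560 on little-endian platforms
    }
}
```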


garglblarg February 2016

I think the part that people might struggle with is that the Buffer.SetByte() method iterates over the array differently than a regular assignment with the array indexer []. The indexer steps through the array in units of the element type (short, double, etc.), while SetByte steps through it byte by byte. To use your example: the short array is usually seen as arr = [xxxx, yyyy] (in base 16), but the SetByte method "sees" it as arr = [xx, yy, zz, ww].

So a call like Buffer.SetByte(arr, 1, 5) addresses the second byte in the array, which is still inside the first short, and sets the value there. The result should look like:

[0500, 0000] in hex, or [1280, 0].
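The contrast between the two access styles can be shown side by side. A minimal sketch (the 1280 result assumes a little-endian platform):

```csharp
using System;

class IndexerVsSetByte
{
    static void Main()
    {
        short[] a = new short[2];
        short[] b = new short[2];

        a[0] = 5;                 // indexer: writes a whole short element
        Buffer.SetByte(b, 1, 5);  // SetByte: writes one byte, at byte index 1

        Console.WriteLine(a[0]);  // 5
        Console.WriteLine(b[0]);  // 1280 (0x0500) on little-endian platforms
    }
}
```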

Post Status

Asked in February 2016
Viewed 2,020 times
Voted 4
Answered 3 times
