[Solved] Cast a Char pointer to UINT32


I’m going to guess that your code is calling an API that takes a 32-bit opcode as its first parameter, and that the author of the API chose an opcode value such that a combined hex/ASCII dump of it shows the ASCII letters “SELF”. For an example of this approach, see the nginx configuration code: https://github.com/nginx/nginx/blob/master/src/core/ngx_conf_file.h

#define NGX_CORE_MODULE      0x45524F43  /* "CORE" */

The string “CORE” appears only in a comment for reference; the code never uses an actual string. The symbol is an unsigned 32-bit integer and can be used anywhere an int is required. However, the 4 bytes of the int, read from the low byte up (little endian), are 0x43 (‘C’), 0x4F (‘O’), 0x52 (‘R’) and 0x45 (‘E’).
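
If you want to convince yourself of that byte layout, a quick standalone check along these lines will print the four letters (a sketch, assuming a little-endian machine and standard C; the variable names are mine):

#include <stdio.h>
#include <string.h>
#include <stdint.h>

int main(void)
{
    uint32_t core = 0x45524F43;        /* "CORE" */
    char text[5] = {0};                /* 4 bytes plus a terminating NUL */

    memcpy(text, &core, sizeof core);  /* copy the raw bytes of the int */
    printf("%s\n", text);              /* prints CORE on a little-endian machine */
    return 0;
}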

I recommend you follow the same approach: instead of creating an ASCII string “SELF” and doing the math at run time to convert it to an integer, let the compiler do the work.

#define SELF_CODE  0x464C4553   /* "SELF" */ 

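If you prefer not to work out the hex value by hand, an equivalent way to write the same define is to build it from the character literals with a small helper macro and let the compiler fold it to 0x464C4553 (a sketch; the MAKE_CODE name is mine, and it assumes UINT32 is the same 32-bit unsigned type your code already uses):

#define MAKE_CODE(a, b, c, d) \
    ((UINT32)(a) | ((UINT32)(b) << 8) | ((UINT32)(c) << 16) | ((UINT32)(d) << 24))

#define SELF_CODE  MAKE_CODE('S', 'E', 'L', 'F')   /* expands to 0x464C4553, "SELF" */
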
Using this new constant, rewrite your code to avoid using any strings:

/* UINT32 comes from your existing headers (typically a typedef for a 32-bit unsigned int) */

void do_this(UINT32 comm, UINT32 num, UINT32 value);   /* prototype so main() can call it */

int main(void)                                          /* main should return int, not void */
{
    do_this(SELF_CODE, 18, 100);
    return 0;
}

void do_this(UINT32 comm, UINT32 num, UINT32 value)
{
    UINT32 inl_values[13];

    inl_values[0] = comm;   /* the 32-bit code is stored directly; no string, no cast */
}
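
With this arrangement the code never needs to cast a char pointer to UINT32 at all; the opcode travels through the program as a plain integer from the moment it is compiled.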
