
May read() syscall not set first bytes of a buffer?



I'm working on some pwn.college binary exploitation challenges.

ASLR is disabled, the stack is executable, and there is no canary, but there is one thing I don't understand.

I have shellcode that opens the flag file and prints its contents to stdout, and it works. When I inject it into the stack buffer everything seems fine, but looking at the bytes I've noticed that the first byte of the buffer is not being set.

The destination buffer is at address 0x7fffffffd2c0, and the vulnerable program initializes it to 0 at the beginning of the function.

Here is the buffer content before the call to read(): (screenshot)

Just after the call, the content looks like this: (screenshot)

As you can see, the first byte has not been changed, and in this particular run it should have been set to 0x90 (a NOP).

A hexdump of my shellcode: (screenshot)

I was stuck on the same problem in the previous challenge, but after compiling the shellcode on the target machine instead of on my PC everything worked, so I assumed it was caused by some difference between the environments. This time, however, I did everything inside the lab.

This problem leads to an "illegal instruction" error or a segfault when execution jumps to the beginning of the buffer, since the wrong first byte breaks everything. Yet read() returns 80, which is exactly the length of my shellcode, so it claims to have read all the bytes from stdin.

Is there any reason why a call to read() might not set some bytes of the buffer, especially the first one?