by mwayne »
Hi,
So, this worked. I was able to make an arbitrary waveform generator that reads from a binary file, writes to an FPGA over PCIe, and outputs the binary bit pattern over the high speed serial transceivers. We are trying to push the limits of what it can do, and have run into a weird problem.
I have something like this in my code:
[code]bytesleft = streamsize; // the number of bytes in the binary file
char device_stream[32] = "\\\\.\\xillybus_qfpdstream"; // my xillybus device
if ((fd_device = _open(device_stream, _O_WRONLY | _O_BINARY)) < 0)
    printf("Couldn't open device stream.\n");
fd_stream = _open(filename, _O_RDONLY | _O_BINARY);
while (bytesleft > 0)
{
    if (bytesleft >= 500000)
    {
        bytesread = _read(fd_stream, buf, 500000);
        allwrite(fd_device, buf, bytesread);
        bytesleft = bytesleft - 500000;
    }
    else // final partial chunk, so the loop terminates
    {
        bytesread = _read(fd_stream, buf, (unsigned int)bytesleft);
        allwrite(fd_device, buf, bytesread);
        bytesleft = 0;
    }
}
[/code]
We change filename to suit our needs, and 90% of the time everything works correctly. However, the FIRST time we load a new file, the _read operation sometimes takes much longer than usual and the streaming is interrupted.
That is, if I want the FPGA to output {file1, file1, file2, file2, file3, file3, file4, file4, file1, file1}, then sometimes the first instance of fileX takes longer. Not always, but sometimes. The second, third, etc. always seem to take the correct amount of time. And if the same file is called again later (e.g., ..., file4, file4, file1), the first file1 sometimes takes longer even though it was used earlier.
I don't know if it matters, but we are using file sizes ranging from 0.1 Gb to 100 Gb (gigabits, not gigabytes), and they all seem to do this.
Oh, and buf is declared in the main function and passed to _read. I tried making it a static global, but that didn't change anything.
Is there something weird about _read that I'm missing, or is there some caching behavior where, once a file has been loaded into RAM, Windows knows it's there and doesn't load it from disk again?
Thanks,
Michael