6

I would like to create an ISO file for testing optical drives with the method mentioned here. I want to create an ISO file filled with random data, so I can burn it to a CD, read it back and compare it with the original ISO. I think the command below will do what I want.

dd if=/dev/urandom of=cd-rom_test.iso bs=2048 count=333000

However, I am unsure of a few things.

First of all, what should the block size be? Do ISO images include the header and error correction data that is on a CD? Should the block size be 2048 (without header and error correction) or 2352 (with header and error correction)?

Secondly, a few sites I have come across use sync in the conv option (example here). As far as I can see, that fills the header and error correction information with 0s, which surely isn't right.

  • 1
    In the examples on the page you've linked, the noerror and sync options are there to deal with read errors on the device being read. For example, if you had a scratched CD and read it with noerror,sync, the error blocks from the input device would be zeroed in the output file. – Sean C. Sep 14 '12 at 14:34
  • Ok, I understand better now. Thanks a lot. – robingrindrod Sep 14 '12 at 14:36
  • I am a bit confused as to what exactly you're trying to do and why you need an ISO with randomly generated data instead of using a regular ISO like a Linux CD image? – Karlson Sep 14 '12 at 14:42
  • @Karlson I want a disc that is completely full to test the optical drive. Random data seems a quick and easy way of achieving this. – robingrindrod Sep 14 '12 at 15:09
  • 1
    333000 is the number of sectors for a 650 MB CD. A 700 MB CD is made of 360000 sectors (the arithmetic is sketched after these comments). –  Sep 15 '12 at 10:11
  • 1
    Also the block size one should use is 2048, not 2352, because the ISO 9660 filesystem is not written in the bytes reserved for header information and error-correction data. –  Sep 15 '12 at 10:49
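
For reference, a quick sketch of the arithmetic behind those figures, assuming the 2048-byte user-data sector size:

echo $((333000 * 2048))   # 681984000 bytes, the nominal 650 MB CD capacity
echo $((360000 * 2048))   # 737280000 bytes, the nominal 700 MB CD capacity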

3 Answers

8

Based on what you have described you should do something like this:

dd if=/dev/urandom of=testfile bs=1M count=699
mkisofs -o test_cdrom.iso testfile

Once that is done you can read from and write to the optical media to your heart's content.

One thing I would suggest: instead of pretesting the optical media, then writing the actual ISO and verifying it, you might as well write the target data ISO once, since that gives you exactly the result you are after without spending time on Write->Read->Verify->Format.
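
As a rough sketch of the burn-and-verify round trip (the /dev/sr0 device path, the choice of wodim, and the readback.iso name are assumptions, not something specified in this answer):

wodim dev=/dev/sr0 -dao test_cdrom.iso                # burn the image (assumed device path and burning tool)
blocks=$(( $(stat -c %s test_cdrom.iso) / 2048 ))     # number of 2048-byte sectors in the image
dd if=/dev/sr0 of=readback.iso bs=2048 count=$blocks  # read back exactly that many sectors
cmp test_cdrom.iso readback.iso && echo "disc matches the image"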

Karlson
  • 5,875
2

Since you're writing to a file with dd, the block size will not change the resulting output. Block size only matters when writing to devices.

You're reading from a device that won't ever return short reads, so you don't need to use sync to pad blocks.

In any event, dd does not have read or write access to the error correction data. As with magnetic media, the drive manages that data and you see only the error-corrected blocks.
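
A quick way to convince yourself of this with regular files (a throwaway sketch with made-up file names, not taken from the answer):

dd if=/dev/urandom of=src.bin bs=1M count=8   # 8 MiB of random data to copy around
dd if=src.bin of=copy-a.bin bs=512            # copy it in 512-byte blocks
dd if=src.bin of=copy-b.bin bs=1M             # copy it again in 1 MiB blocks
md5sum src.bin copy-a.bin copy-b.bin          # all three checksums are identical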

Kyle Jones
  • 15,015
  • Thanks for your help. It makes much more sense now. The reason I specified the block size is that I also specified count, so the block size will influence the amount of data I get, won't it? – robingrindrod Sep 14 '12 at 15:27
  • 1
    Even when writing directly to a CD, the block size is only a matter of performance. The block size matters only when writing to tape devices. – Gilles 'SO- stop being evil' Sep 14 '12 at 23:35
0

Creating an image made of random bytes is not a good idea, because it results in an obviously invalid ISO 9660 filesystem. You should instead create a large random file that can still be put inside a valid ISO filesystem. Then you burn this ISO image and read the content of the CD for comparison with the original file on the hard disk, for example with either md5sum or sha1sum. If the checksums are the same, then your drive is OK at both writing and reading CDs.
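
A sketch of that workflow, assuming a drive at /dev/sr0, a mount point at /mnt/cdrom and an illustrative file name (the -r option keeps the original file name readable on the disc via Rock Ridge):

dd if=/dev/urandom of=random_payload bs=1M count=650  # a large random file, sized for a 700 MB disc
mkisofs -r -o test.iso random_payload                 # wrap it in a valid ISO 9660 filesystem
# burn test.iso with your usual burning tool, then:
mount /dev/sr0 /mnt/cdrom                             # assumed device and mount point
md5sum random_payload /mnt/cdrom/random_payload       # the two checksums should be identical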

  • 1
    Much appreciate your help. Would you mind clarifying why it matters whether the data is a valid ISO 9660 filesystem? All I am really interested in is the raw data and whether it is read correctly. – robingrindrod Sep 15 '12 at 11:52
  • 1
    At first I thought that CD burning software could refuse to write an invalid ISO image, but it seems that is not the case. Just don't give the random image file the .iso suffix, because it's not an ISO 9660 filesystem. Anyway, if you compare ISO images instead of the content of ISO images, you may have minor problems with trailing zero bytes (see this for details). –  Sep 15 '12 at 12:37
  • Thanks for the tip about the trailing zeros. Do you have any idea what causes this? Also do you have a source to verify that ISO images can only contain ISO 9660 filesystems, as the Wikipedia page states that "The name ISO is taken from the ISO 9660 file system used with CD-ROM media, but what is known as an ISO image might also contain a UDF (ISO/IEC 13346) file system or a DVD or Blu-ray Disc (BD) image." – robingrindrod Sep 15 '12 at 13:09
  • Unfortunately I don't know what causes the presence of trailing zeroes. As for images other than ISO 9660 filesystems, I have never seen them. But please consider that I'm not an expert. So let's wait and see if somebody else knows more than us. –  Sep 15 '12 at 20:29
  • This post mentions that the trailing zeros might pad out the last block if the data does not fit evenly into the blocks. Is this what you were referring to or have you experienced extra blocks filled with zeros? – robingrindrod Sep 16 '12 at 13:24
  • I have never seen an ISO image whose size was not a multiple of the block size (2048 bytes). The problem I encountered is extra blocks filled with zeroes. Anyway, even ISO images for live CD Linux distributions such as Ubuntu or Arch Linux come with zeroes at the end (hundreds of them). I don't know if it is normal. Just be sure to know this if you get a sha1sum/md5sum mismatch (a possible workaround is sketched below). –  Sep 16 '12 at 20:33
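
If you do hit such a mismatch, one workaround (a sketch reusing the hypothetical test_cdrom.iso and readback.iso names from earlier) is to limit the comparison to the length of the original image, so trailing zero padding cannot cause a false mismatch:

size=$(stat -c %s test_cdrom.iso)           # length of the original image in bytes
cmp -n "$size" test_cdrom.iso readback.iso  # compare only that many bytes
head -c "$size" readback.iso | md5sum       # or checksum just that prefix of the readback
md5sum test_cdrom.iso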