Optimal source image size

  • Question

  • First, thanks for making these tools available for free; I really appreciate the effort. There are lots of goodies in this field (hdmake, ICE, Deep Zoom Composer, etc.).

    I'm using hdmake to create a pretty large SL Deep Zoom composition. I started with a large number of small images (more than 300,000 JPEG files, 256x256 px each) and first assembled them into larger blocks with some code I wrote (result: ~150 images of roughly 14000x14000 px each), thinking it would make the job easier for hdmake (and also ease transportation, since copying around so many files is rather painful).
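For a rough sense of the assembly step described above, the grid math can be sketched like this (the exact block size and packing are assumptions; the poster's ~150 blocks suggests the tiles did not fill every block completely):

```python
# Rough grid math for packing 256x256 tiles into ~14000 px blocks.
# BLOCK_PX = 55 * 256 = 14080 is an assumed block size close to the
# "~14000x14000" mentioned in the post.
TILE_PX = 256
BLOCK_PX = 14080
TILES_PER_SIDE = BLOCK_PX // TILE_PX    # 55 tiles along each edge
TILES_PER_BLOCK = TILES_PER_SIDE ** 2   # 3025 tiles per block

def blocks_needed(total_tiles: int) -> int:
    """Minimum number of large blocks needed to hold all the small tiles."""
    return -(-total_tiles // TILES_PER_BLOCK)  # ceiling division

print(TILES_PER_BLOCK)         # 3025
print(blocks_needed(300_000))  # 100
```

At this packing density, ~100 fully dense blocks would suffice; the ~150 actually produced implies the source tiles were laid out sparsely (e.g. following a map rather than a solid grid).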

    Now hdmake seems to be running fine on the large images (always under 900 MB of RAM, which I think is very good given the image size), but I'm wondering whether my intermediate step was necessary or completely useless.
    In other words, is it better to give hdmake a huge set of small images, or a small set of very large images?
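A back-of-the-envelope estimate puts the 900 MB figure in context. Assuming hdmake decodes one full RGB bitmap at a time (its real internals are not documented in this thread), a single 14000x14000 block alone accounts for most of that:

```python
# Uncompressed size of one decoded RGB image, in MB.
# bytes_per_pixel = 3 assumes 8-bit RGB; the tool's actual
# internal pixel format is an assumption.
def decoded_mb(width: int, height: int, bytes_per_pixel: int = 3) -> float:
    return width * height * bytes_per_pixel / 1024 / 1024

print(round(decoded_mb(14000, 14000)))  # 561
```

So one decoded block is roughly 561 MB uncompressed, which is consistent with the observed peak staying under 900 MB while processing one large image at a time.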


    Wednesday, May 12, 2010 8:58 PM

All replies

  • I haven't experimented with this, but I'm guessing that the time to generate the intermediate files, plus the time hdmake takes on them, will be greater than simply pointing hdmake at the originals.
    Thursday, May 13, 2010 5:30 AM