This topic provides advice for achieving maximum performance. Manifold is so fast by default that on most computers performance will be superb for typical tasks. For larger data, very large projects, or more complex work, keeping in mind a few tips will help provide the fastest possible performance.
Manifold is designed for an era where human time is expensive but processor cores, GPU cores, memory, and storage in the form of disk drives or SSD are cheap. Manifold therefore makes heavier use of cheap resources to economize on the most expensive part of GIS: our time.
That's why Manifold .map files will generally be significantly larger than the sum of the file sizes of files that have been imported into the .map file: by pre-computing essential structures and storing data for speed rather than compactness, .map files provide much faster performance, and save us a lot of time.
Manifold is very fast on almost any reasonably recent Windows system. For the fastest possible performance consider the following tips:
Recent versions of Windows, such as Windows 10, are much faster and more stable than earlier versions. Not only is Windows itself faster, but accessory subsystems used by Manifold, such as rendering subsystems, are also faster, sometimes much faster, than their predecessors.
Modern Windows editions provide many power saving options, even on desktop machines that are plugged in all the time. Surprisingly, the default choice might be a Balanced plan that reduces performance to reduce energy consumption. By running the processor slower or by turning off disk drives a power plan can significantly reduce energy consumption, but at the cost of significantly reduced computer performance.
For maximum performance with Manifold, make sure your power plan is set to High performance. Whatever power plan is in effect, ensure that all disk drives and other devices you will be using are never turned off, since the time needed to turn a disk drive back on and spin it up to full readiness can result in dramatically slower operation. External drives, for example portable drives connected via USB, are often turned off to save power after a period of inactivity.
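On recent versions of Windows the active power plan can be checked and switched from an elevated command prompt using the built-in powercfg utility. SCHEME_MIN is the standard alias Windows provides for the High performance plan (minimum power savings); on some OEM systems the plan may be hidden and must first be listed or restored.

```
rem List the power plans installed on this machine; the active plan is marked with *
powercfg /list

rem Switch to the High performance plan (alias SCHEME_MIN = minimum power savings)
powercfg /setactive SCHEME_MIN
```

After switching plans, also check the plan's advanced settings to confirm that hard disks are never turned off.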
Virus checkers and other programs running in background can be performance killers. Learn how to prevent them from checking and re-checking the work you do in Manifold.
Manifold is automatically CPU parallel, running on all the cores in our CPU or CPUs. If we have a 16 core CPU, Manifold will use all 16 cores. For many tasks, more cores beat faster cores: a CPU with many cores running at average speed will often outperform a CPU with fewer cores running faster. The classic example is an inexpensive 8 core processor delivering better performance than an expensive 4 core processor. Manifold runs great on both AMD and Intel processors. In most cases, 12 or 16 cores (24 to 32 hyperthreads) are ideal for GIS work. Going beyond 24 cores brings diminishing price/performance returns, since in most cases 32 hyperthreads used efficiently will finish the job before additional threads can be put to efficient use.
Manifold utilizes NVIDIA GPUs for GPGPU processing, achieving much better performance through massively parallel computation whenever such computation makes sense. When GPU makes a difference, almost any recent vintage NVIDIA GPU delivers big results. Therefore, every system running Manifold should include at least some GPGPU-capable NVIDIA GPU. Even the cheapest GPGPU-capable card is much better than none. GPU parallelism requires an NVIDIA GPU.
Modern GPUs are so fast, however, that even with Manifold parallelism only exceptionally computationally intensive, unusually large jobs will be limited by GPU performance. Buying a "good value" GPU card instead of the most expensive, latest and hottest GPU card usually gives the best price/performance ratio. If funds are truly unlimited then, of course, we can spend big on one or more of whatever is the latest, hottest GPU card, but in most cases one or two relatively inexpensive yet relatively recent GPU cards will provide performance that is nearly as good.
Another reason not to overspend on GPU is that Manifold makes exceptionally good use of parallel, manycore CPU computations. Using manycore CPU efficiently in parallel computations leaves fewer cases when it makes sense to dispatch to GPU. A better balance point, therefore, is to have a 12 or 16 core CPU with an average GPU, instead of a 4 or 8 core CPU with a very expensive GPU.
When RAM does not suffice, Windows will page out to disk. That is a performance killer.
We must be running 64-bit Windows and 64-bit Manifold to take advantage of lots of RAM.
We must have enough free space on SSD or disk to host the size of files with which we work.
We must have enough free space on SSD or disk for pagefiles and temp files to host them as well.
Temp files can expand to three or more times the size of the actual project.
Compressed formats can expand to many times the data size when data is decompressed into working form.
Although Manifold has remarkably good performance on systems that are under-equipped with memory, it can do better when the computer has lots of cheap memory installed. Think big: install the maximum memory your system can host if you can afford it. Usually 32 GB is fine for most cases, though more will of course help when working with projects larger than that. Memory is cheap. Your time is expensive. Burn memory, not time.
If we have lots of memory in our system, Manifold can minimize the need to use much slower SSD or disk drive storage. Secondary storage will still be used: there is no getting around the need to get data off SSD or disk and into memory, and no matter how fast Manifold may be, Windows imposes a limit on how fast secondary storage can perform. Therefore it pays to use solid state (SSD) drives for maximum speed, and to use faster disk drives in older systems that have no SSD. Larger disk drives tend to be faster than smaller ones, and a larger drive also saves time by eliminating doubt about whether there is enough space for iterative backups and other time-saving workflow. Disk drives with many terabytes have become cheap. Install plenty of cheap disk storage so you always have ample free space for archival storage and backups.
Given at least four cores in the CPU, priority should go to having a reasonable amount of RAM (16 GB or so, more is better) and a fast M.2 SSD for primary storage, and then to spending more on CPU cores to reach 12 cores. Dialing RAM up to 24 or 32 GB is a good idea as well, if we can afford it.
Older installations of Manifold provided both 64-bit and 32-bit versions. After many years of supporting 32-bit versions, in 2022 Manifold dropped 32-bit support and now ships exclusively 64-bit versions in all new builds. If you are using an out-of-date, 32-bit version of Manifold, switch to 64-bit Manifold running in 64-bit Windows on a 64-bit computer.
Read Manifold documentation, watch videos, participate in the forum, and learn as much as you can so you are always using optimal methods.
Learn about your Windows system, including all the background junk software that is running. Learn how to minimize the performance impacts of background programs such as anti-virus and similar applications.
Take a few minutes to think about a task before launching into it. There are often many different paths to the same end within Manifold. A good plan will help you choose the best path and avoid unnecessary work.
Don't fall into the trap of making projects more complicated than they need to be. Most GIS projects follow the 80 / 20 rule, where 80% of the desired result comes from 20% of the implementation effort. Do that 20% first and then try it out. You might find that you are happy with the result.
Try to find a pre-existing data set that provides what you need before investing time in building your own. Some people waste weeks creating new maps without realizing they could have downloaded a pre-built map from a free site.
Try to find the data you need in modern formats that automatically convey all necessary information, such as projections, in a user-friendly way.
Use web servers for background maps. Do not re-invent the wheel.
Save your work regularly in a sequence of projects so you never have to waste time redoing an entire project after a power failure or user error at some stage. Have lots of free disk space, many terabytes, so you never have to think twice about doing a save.
Learn keyboard shortcuts and use them in combination with mouse moves. For example, skilled users will keep the left hand on the keyboard for CTRL-C and CTRL-V to Copy and Paste while the right hand moves the mouse to select items and manipulate windows and other mouse-based controls.
Learn to write SQL queries. Simple SQL is absurdly easy to learn, much easier than beginners think, and you will get a lot of use out of really simple SQL that can be learned quickly. A short snippet of SQL, even just a few lines, opens the door to immense power, custom capabilities and time saved. SQL is huge bang for the effort. Advanced users should invest time in getting comfortable with more advanced SQL.
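As a taste of how short useful SQL can be, the following Manifold-style query sums the area of all parcels larger than one hectare. The table name [Parcels], the geometry field [Geom], and the one-hectare threshold are hypothetical examples, not names from this topic; GeomArea takes the geometry and a tolerance, with 0 meaning automatic.

```sql
-- Total area of all parcels larger than one hectare (10000 square meters).
-- [Parcels] and [Geom] are hypothetical names used for illustration.
SELECT Sum(GeomArea([Geom], 0)) AS [total_area]
  FROM [Parcels]
  WHERE GeomArea([Geom], 0) > 10000;
```

Three lines of SQL like the above replace a manual select-and-inspect workflow that could take many minutes by hand.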
Advanced users should consider learning to write scripts. Automating a task so that it takes care of itself while you are at lunch or at home is a wonderful time saver. At times a very simple script or query can replace a long sequence of commands using pre-built tools. Write the simplest code that works and use it. Write scripts and queries so their internal functioning is obvious. Include plenty of comments to help you remember what the code does when you need to change it months or years later.
When seeking help on user forums or from tech support, provide details on your system configuration, your project, where everything is stored, and the step-by-step process you are using.
Sometimes you may be called upon to do a job under time pressure. The more time pressure you feel to complete a project, the more important it is to work systematically, steadily and carefully. Don't panic. Take it step by step at a steady pace. If you are short on time you don't have time for errors. Measure twice, cut once.
Get plenty of sleep, eat light, and exercise regularly. Fatigue causes errors and panic. Good health will help you think clearly and execute with authority.
The user is almost always the key factor in performance. The greatest gains in performance are usually achieved by using smarter, more skilled workflow, not by throwing money at faster hardware. More often than not the sole factor in whether a better method is used is the expertise and clarity of mind that can be mustered by the user. A healthy, well-rested, expert user is the best performance accelerator around.
Tech Tip: In queries use THREADS to make full use of processor cores available. This is automatically built in when using templates in the Transform pane, but it is something we must add when writing queries from scratch.
The THREADS command takes a value for the number of threads to use. For example, if we know we have six CPU cores but we only want to use four threads we could write...
THREADS 4
To automatically use all CPUs we have available we can add...
THREADS SystemCpuCount()
...to the end of a query. That tells Manifold to see how many CPUs are available, the result of SystemCpuCount(), and to use that many threads. The result can be dramatic, literally running a query several times faster than it would run without launching multiple threads.
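Putting the pieces together, a complete query using THREADS might look like the sketch below. The table [Roads], the geometry field [Geom], and the 100 meter buffer distance are hypothetical illustrations; the key point is that the THREADS clause goes at the end of the query.

```sql
-- Buffer every road by 100 units, using one thread per available CPU core.
-- [Roads] and [Geom] are hypothetical names used for illustration.
SELECT [mfd_id], GeomBuffer([Geom], 100, 0) AS [buffered]
  FROM [Roads]
  THREADS SystemCpuCount();
```

Without the THREADS clause the query runs in a single thread; with it, Manifold dispatches the work across all cores, just as the Transform pane templates do automatically.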
It should go without saying that we should have enough free space on disk (meaning either SSD or hard disk) to work with data of the size desired. We cannot work with 100 GB of data if we have only 20 GB of free disk space. That much is obvious, but there are nuances that are easy to forget. For example, we may use a faster, smaller SSD drive for TEMP folder and pagefile storage. In such cases it is easy to forget that while we may have ample free space, terabytes, on our primary disk, we might have very limited free space, only a few gigabytes, on the SSD drive used for TEMP storage. Both Windows and Manifold make heavy use of TEMP storage and, potentially, pagefile storage, so if there is not enough space on that SSD drive we might not have the free disk space to work with the size of project desired.
As a general rule of thumb, many Manifold users like to have three times the maximum size of the project available in TEMP storage free space. That is larger than necessary for simple operations but may come into play with more complicated tasks.
Another nuance is forgetting that compressed file formats can result in much larger data sizes when the data is decompressed into working form. Classic examples are image formats such as TIF, or LiDAR storage formats, which often store data in compressed form. Compressed formats may be compact, but they are very slow, and in any event the data will be decompressed into true binary form for use within Manifold. Compression factors of 10 or 20 are common, so a 10 GB file in compressed form might decompress into 100 GB or 200 GB of real data. Multiply by three (an extreme case) and we might realize that we do not have the 600 GB of free disk space required.
The strategy for dealing with such issues is simple and inexpensive: disk drives providing many terabytes of storage are now cheap, and multi-terabyte SSD drives have also become inexpensive. Use larger drives so that several terabytes of free space are always available, even when working with very large data. Free space on disk is absurdly inexpensive compared to the cost of our time. Spend disk drive space, not time.
Manifold can seem to work miracles even with slow formats. For example, Manifold .MAPCACHE technology allows Manifold to link large ESRI shapefiles or MapInfo MIF files into a project and to open and render them even faster than ESRI's own ArcGIS shapefile rendering software or MapInfo's own software. At times Manifold can even render and edit large linked shapefiles or MIF files with speed approaching that of Manifold's own .map format.
Despite such magic, nothing is as fast as Manifold .map project file format. It takes time to import data from some other format into a .map project, but the rule of thumb is that if the data is worth working with, it is worth importing and saving in a .map file.
In most cases the fastest possible setup is to import data into a project and to work with the .map project file in local storage, such as a fast local SSD. Manifold working with .map format usually runs faster than even SSD can serve data, so for projects involving lots of data, the faster the SSD or disk storage, the better. Even better is to have lots of cheap main memory so the system does not have to touch disk drives as often.
Manifold can also do read/write work with many data formats without importing the data, working with it while it remains linked from, and still resides in, the original file format. That is convenient for casual use, like editing data "in place" in the original file, but it will not be as fast as native Manifold .map storage.
The convenience of editing an older format "in place" may make the limitations imposed by that format tolerable for small projects; however, as a general rule of thumb it is a better idea to import the data into the Manifold project, save it in a Manifold .map file, and then, if need be, export it back out to the older format. That is true even with the overhead of importing the data into Manifold and later exporting it back to the original format.
Better still is to import the data sets that we may frequently use into Manifold .map format and then work exclusively with .map format as much as possible, exporting to other formats only when necessary to interchange data to someone who does not have Manifold.