In a previous blog, we discussed the key advantages of BOXX bringing AI to the desk side, and among them was the financial advantage. Although it was last on the list, it may very well be the most important.

For most serious AI workloads, the amount of data processed demands a cloud-based service, and that service comes at a significant price. Each new workload requires compute and memory nodes in order to run, and the more complex the application (or task), the more resources it consumes. Cloud providers charge by the number of cores and gigabytes of memory you use per hour, and the cloud resources you or your organization are paying for are not infinite. That being the case, you don't want to consume valuable cloud resources when doing so hinders the productivity of more important applications or simply isn't cost effective.

Add to that the cost of uploading data to and downloading it from cloud storage, and the expense can quickly become substantial. Of course, your cloud AI service will save time once the application has been developed (and will be much faster), but developing that AI app in the cloud would be economically inadvisable.
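To make that cost math concrete, here is a minimal back-of-the-envelope sketch. The rates and quantities below are hypothetical placeholders, not quotes from any provider; real cloud pricing varies by instance type, region, and contract.

```python
# Rough cloud cost estimate for an AI development cycle.
# All rates and quantities below are hypothetical, for illustration only.

COMPUTE_RATE_PER_HOUR = 3.50   # hypothetical GPU instance rate, $/hour
EGRESS_RATE_PER_GB = 0.09      # hypothetical data-transfer-out rate, $/GB

def cloud_cost(hours: float, data_out_gb: float) -> float:
    """Estimate total cost: compute time plus data egress."""
    return hours * COMPUTE_RATE_PER_HOUR + data_out_gb * EGRESS_RATE_PER_GB

# Example: 200 hours of iterative experimentation, 1 TB downloaded.
print(f"${cloud_cost(200, 1000):,.2f}")  # → $790.00
```

Even at these modest assumed rates, iterative experimentation adds up quickly because every rerun of a model meters both compute hours and data movement.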

So, it’s understood that the sheer processing power required for some AI applications means at some point you’re going to have to take your AI workload to the cloud. But for the sake of this discussion, the operative phrase is, “at some point.”

You see, the early stages of your AI workflow may involve experimenting with different AI models and algorithms, an extremely time-consuming process that results in multiple iterations. As mentioned previously, time is most definitely money when paying for a cloud-based AI service, so imagine a cost-effective way of developing and training your AI models before incurring the cost of a cloud service. The solution is an AI desk side workstation. At the entry level, BOXX offers the essential-class APEXX W3, a liquid-cooled subcompact powered by an Intel Xeon W5-2465X processor and a single NVIDIA RTX 6000 Ada GPU with 128GB of memory.

If your AI workload requires more compute power, the advanced-class APEXX W4, a liquid-cooled workstation powered by an Intel Xeon W7-3455 processor and available with up to two NVIDIA RTX 6000 Ada GPUs and 256GB of memory, is the optimal choice. With either of these workstations, or the BOXX RAXX W3 rack-mounted system (Intel Xeon W9-3475X processor, four NVIDIA RTX 6000 Ada Generation GPUs, and 512GB of memory), you could develop and fine-tune your model before ever committing the process (and your dollars) to the cloud.
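The break-even reasoning behind that choice can be sketched in a few lines. The workstation price and cloud rate below are hypothetical placeholders, not actual BOXX or cloud pricing; the point is the shape of the calculation, not the specific figures.

```python
# Break-even point: hours of local development at which a one-time
# workstation purchase costs less than renting equivalent cloud compute.
# All figures below are hypothetical, for illustration only.

WORKSTATION_COST = 15000.0     # hypothetical one-time purchase price, $
CLOUD_RATE_PER_HOUR = 3.50     # hypothetical equivalent cloud rate, $/hour

break_even_hours = WORKSTATION_COST / CLOUD_RATE_PER_HOUR
print(f"Break-even after ~{break_even_hours:,.0f} hours of use")
```

Under these assumptions, a team that spends its development and fine-tuning hours on a deskside machine instead of metered cloud instances recovers the hardware cost well within the working life of the workstation.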

For example, suppose you wanted to tackle an unstructured task involving attributes, volume, and tone, such as identifying and compiling all images of red-haired television characters from 1990-1999. That could eventually mean churning through hundreds of terabytes of video data, i.e., lots of time and lots of money. With your desk side workstation (and a recommended, economically feasible storage solution), you could essentially take an AI test drive without paying for the time it takes to determine and refine your questions, provide sample data, copy data locally, and fine-tune the model.

Of course, there is the upfront cost of purchasing a desk side workstation, but when you're in the business of AI workloads, you'll eventually achieve significant ROI. To learn more, visit the BOXX AI workstation solutions page, or talk to a BOXX performance specialist via chat or by calling 877.877.BOXX.