GPU programming is a mess. It relies on frameworks tied to specific devices, incompatible shading languages, and drivers that can sometimes cause problems. But WHY is it so bad? After all, CPUs are much more convenient to program: even though there are multiple architectures on the market, CPU programs can be compiled fairly easily with GCC or LLVM to target multiple platforms. You cannot really expect this level of compatibility with GPUs, because all the cross-platform frameworks have drawbacks. This video explains how we ended up in this situation. It's mostly because of how the market evolved, but technical and legal factors also played a part. The video also explains how government regulations and consumers can push industries to improve their technology. Maybe GPU programming will get better!

Script:
References:
GitHub repository:
Last video on the same topic:

Chapters:
- Introduction: 0:00
- I. CPU Programming: 0:46
- II. GPU Programming: 5:15
- III. Antitrust: 11:35
- IV. Can It Get Better: 14:35
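
The description above notes that the same CPU code can be built for several architectures with stock GCC or LLVM toolchains. Below is a minimal sketch of that point; the file name hello.c, the target triples, and the cross toolchains are assumptions about a typical Linux setup, not something taken from the video.

/*
 * hello.c - the same source builds for different CPU architectures
 * using off-the-shelf compilers, e.g. (assumed toolchains, and assuming
 * the matching sysroot/libc for each target is installed):
 *
 *   clang --target=x86_64-linux-gnu  -o hello_x86_64  hello.c
 *   clang --target=aarch64-linux-gnu -o hello_aarch64 hello.c
 *   # or, with a GCC cross toolchain installed:
 *   aarch64-linux-gnu-gcc -o hello_aarch64 hello.c
 *
 * There is no comparable "one source, one stock compiler" path that
 * covers every GPU vendor, which is the gap the video discusses.
 */
#include <stdio.h>

int main(void) {
    printf("Hello from whichever CPU this was compiled for!\n");
    return 0;
}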











