What Two AIs Saw in My Study
The biggest change in chat models over the last two years is not the one most articles focus on. It is that they can now actually look at things. Image input went from a feature you would test once and forget about to something I use weekly without thinking about it, and the gap between describing a problem in words and simply showing the model what you are looking at turned out to be much bigger than I expected. The first time it really landed for me was a few years ago, when I started feeding ChatGPT photos of error screens and bits of hardware I could not be bothered to describe. By the time photo-based questions felt routine, I had built up enough trust in ChatGPT specifically that vision tasks became one of the things I would default to it for, even as Claude took over for almost everything else I do. ...