I never imagined hearing so many plans for people to be dead within 2 decades when I started this!
The issue isn't so much that today's IP cams don't support dates past 2038, but that the developers of those systems will remain unaware and uncaring for many years to come. Devices that can't handle larger dates will probably continue to be sold well into the 2030s, and millions upon millions of them will still be operational and useful when the rollover hits.
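For anyone who hasn't looked at the mechanics: the problem is firmware that stores time as a signed 32-bit count of seconds since 1970-01-01 UTC, which runs out one second after 2038-01-19 03:14:07. A minimal C sketch of the rollover (assuming the host's own time_t is 64-bit, so the wrapped value can still be printed as a date):

```c
/* Minimal sketch of the Y2K38 rollover: a signed 32-bit time_t
 * runs out one second after 2038-01-19 03:14:07 UTC.
 * Assumes the host's own time_t is 64-bit so the wrapped value
 * can still be rendered as a date. */
#include <stdio.h>
#include <stdint.h>
#include <time.h>

static void show(int32_t stamp) {
    time_t t = (time_t)stamp;                 /* widen to host time_t */
    char buf[32];
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&t));
    printf("%11ld -> %s UTC\n", (long)stamp, buf);
}

int main(void) {
    int32_t last = INT32_MAX;                 /* 2038-01-19 03:14:07 */
    show(last);
    /* Add one second via unsigned math; the cast back to int32_t is
     * implementation-defined but wraps on every common platform. */
    show((int32_t)((uint32_t)last + 1u));     /* 1901-12-13 20:45:52 */
    return 0;
}
```

A camera with that bug doesn't just show the wrong date; its timestamps jump back to 1901, which tends to break recording schedules, log rotation, and TLS certificate validation all at once.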
Well said, but there will likely be cameras sold today that are still fully functional in 17 years. The progression of technology slows over time: for single-core CPU performance, the amount of progress that took 2 years in the mid 90s took the whole decade from 2010 to 2020. Granted, digital camera technology is closer to GPU development, which continues to advance much faster than CPUs, but everything will slow down eventually.
I think the h264 (or h265) video codec, which all modern IP cameras use, was a bit of a chicken-and-egg problem. There needs to be a hardware encoder IC to make it practical, there need to be either fast CPUs or hardware decoders to view it, and there need to be high resolution screens to display it. If either of the last two is lacking, there isn't a reason to justify developing a hardware encoder IC, or the codec itself.

A hardware encoder IC is nothing new; there just wasn't a reason to develop one when most users didn't have high resolution displays or general purpose CPUs capable of decoding the output. It wasn't until 2007, with the Intel Core 2 Duo and its double-speed SIMD units, that a typical computer could even decode a 1080p30 movie. Prior to that, a hardware decoder IC would have been needed to view 1080p30 content encoded in h264, and there was no need for one when MPEG2 or MSMPEG4 or DivX could do the job at a fraction of the CPU requirement, albeit at a higher bitrate.

There was nothing stopping the use of an h264-type codec in the year 2000, if h264 hardware had existed for encoding and decoding. So it's not that the codec relied on technology from 2003 (the year the h264 specification was finished) or later to make it possible; the world just wasn't ready for it, because the other things it depends on were not yet commonplace in homes. Ironically, it has been about 17 years since the h264 specification was released, and only recently have IP cameras started moving to h265. We're only now moving away from using 17-year-old technology, in cameras that are supposed to be long obsolete 17 years from now.
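To put that CPU-versus-bitrate trade-off in rough numbers (these are my ballpark assumptions for comparable 1080p quality, not measurements): MPEG2 at ~16 Mbps, h264 at ~8 Mbps, and h265 at ~4 Mbps work out to very different storage costs per camera:

```c
/* Back-of-the-envelope storage cost of the codec choice for one
 * 1080p30 stream. Bitrates are rough assumptions for comparable
 * quality, not measurements. */
#include <stdio.h>

int main(void) {
    const char  *codec[] = { "MPEG2", "h264", "h265" };
    const double mbps[]  = { 16.0, 8.0, 4.0 };   /* assumed bitrates */
    const double tb_bits = 8e12;                 /* bits in one TB */
    for (int i = 0; i < 3; i++) {
        double hours = tb_bits / (mbps[i] * 1e6) / 3600.0;
        printf("%-6s @ %4.1f Mbps: ~%4.0f hours of footage per TB\n",
               codec[i], mbps[i], hours);
    }
    return 0;
}
```

If those bitrates hold, each codec generation roughly doubles the footage you get per disk, which is the whole reason camera vendors eventually bother with the newer codec.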
So with that said, there is really no reason to believe that any huge development in camera video codecs will happen in the next 17 years. Resolutions will keep climbing, but will data storage prices fall to match? And there is a limited amount of bandwidth available for wireless cameras, given how the spectrum has to be shared. Several 1080p30 cameras already reach the limit of wireless speeds unless they're at short range or you have no interference from neighbors.
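Some rough arithmetic behind that last claim, with assumed figures (real-world 802.11g tops out around 20 Mbps of usable throughput, and a decent 1080p30 h264 stream needs about 4 Mbps):

```c
/* Rough wireless budget: how many 1080p30 h264 camera streams fit
 * in the usable throughput of a shared Wi-Fi link. All figures are
 * illustrative assumptions. */
#include <stdio.h>

int main(void) {
    const double stream_mbps = 4.0;  /* assumed per-camera h264 bitrate */
    const struct { const char *link; double usable_mbps; } wifi[] = {
        { "802.11g, clean air",      20.0 },
        { "802.11g, busy neighbors",  8.0 },
        { "802.11n 2.4 GHz",         40.0 },
    };
    for (int i = 0; i < 3; i++)
        printf("%-24s -> ~%.0f cameras\n", wifi[i].link,
               wifi[i].usable_mbps / stream_mbps);
    return 0;
}
```

Five cameras on a clean 802.11g channel, two if the neighbors are busy. The spectrum is shared, so adding access points doesn't help.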
I still have my 802.11g wireless router from 2005, and it works just fine 16 years later. In fact, its range is better than that of many 802.11n access points, whose antenna arrangements are optimized for higher speeds rather than range.
Almost nothing was affected by Y2K. The world was just becoming digital, most computer systems were less than 10 years old, and they shipped Y2K-compliant. Y2K38 is a completely different situation.
What exactly is going to improve on an IP camera in the next 10 years that will justify replacing it?