A few days ago I received and installed software update 8.1 (2018.21.9 75bdbc11). With my car's Autopilot v1 hardware, I don't get many releases these days, so anything new is exciting (and comforting, in that Tesla still remembers us old-generation folks). There wasn't much listed in the release notes, so I assumed it was just a small collection of bug fixes. It turns out that Autopilot's notification behavior towards the driver was updated.
I first noticed the change during the drive from the Bay Area to Palm Springs. While Autopilot was enabled, the top inside edge of the instrument cluster flashed. Previously, when the system hadn't sensed sufficient steering wheel force from the driver (representing "hands-on-the-wheel"), the entire edge of the display would flash. Adding a bit of extra rotational torque against the wheel made the warning go away, as usual.
The way I hold the steering wheel (with hands at the 3 and 6 o'clock positions) isn't enough for Autopilot to recognize that my hands are on it. Either Autopilot's detection sensitivity needs to be increased, or Tesla now expects a stronger hold on the wheel than my usual lightweight resting grip. This is most prominent when the roadway is straight and Autopilot is doing its job keeping the car centered in the lane; there is little to no steering adjustment needed from me in that scenario.
In addition, the nag occurs more frequently, and it seems much more likely to happen when the cruising speed is above a certain threshold (maybe 60-70 mph).
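To make the behavior I'm describing concrete, here's a minimal sketch of how a torque-based nag might work. This is purely my own illustrative model; the function names, torque threshold, and nag intervals are guesses, not Tesla's actual values or implementation:

```python
# Hypothetical model of Autopilot's hands-on-wheel nag, as I experience it.
# All numbers below are illustrative guesses, not Tesla's real parameters.

def nag_interval_s(speed_mph: float) -> float:
    """Seconds allowed between detected torque inputs; shorter at higher speed."""
    return 15.0 if speed_mph >= 65.0 else 30.0

def hands_detected(torque_nm: float, threshold_nm: float = 0.5) -> bool:
    """The system infers 'hands on wheel' only from steering torque,
    so a light resting grip below the threshold reads as hands-off."""
    return abs(torque_nm) >= threshold_nm

def should_nag(torque_nm: float, speed_mph: float,
               seconds_since_last_torque: float) -> bool:
    """Flash the warning when no torque is sensed for too long."""
    if hands_detected(torque_nm):
        return False
    return seconds_since_last_torque >= nag_interval_s(speed_mph)
```

Under this model, a light 0.1 Nm resting grip at 70 mph triggers the nag after 15 seconds, while the same grip at 50 mph gets double the grace period; that matches the speed-dependent behavior I noticed, but again, the specifics are assumptions.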
There's a fine line between keeping this level of force applied with your arms and accidentally applying so much torque that it tugs the steering wheel out of Autopilot's control, requiring the driver to immediately countersteer to keep the car moving in the intended direction. This topic is starting to be discussed in the Tesla forums and other media outlets. Presumably the change was made to alleviate concerns about Autopilot safety, given recent events in which its track record has come under fire from multiple public entities.
That said, I think Tesla still needs to refine this a bit.
We're living in a time where new technologies like Autosteer are in a transition period of human acceptance, abuse of those technologies, and the debate between regulation and delivering cutting-edge, it's-the-future features. Much of the issue, I'm sure, stems from drivers testing Autopilot's intended limits by assuming Autosteer is smart enough to handle everything (it isn't) or treating the system with an excessive sense of casualness.
It's important to remember that Autopilot is a driver-assistance system; it's not autonomous, and it's not self-driving. The human driver behind the wheel is ultimately responsible for monitoring and maintaining control of the car's actions. That said, if you crash your Tesla while Autopilot is enabled and your hands were on the wheel the whole time, yet the system didn't recognize them as such, Tesla is likely going to declare that the driver's hands were off the wheel. That seems rather unfair unless Tesla is more specific about how much force the driver is supposed to place on the wheel, or puts better recognition systems in place.