This is a bigger problem than tests.
You mean things going over estimates or SM/EM complaining about it?
You’re presenting a solution for a problem that the team either does not see as important or doesn’t think exists at all.
It's definitely a known issue, and I think people consider it semi-important. Feels like every other standup has a spiel from the EM about "we need to test things, stop breaking things, etc.".
Whenever I argue on their terms though, I quickly "lose", because business terms seem to be, "agree to everything from the business, look busy, and we will have time for IT concerns (i.e., testing) when we are done with business projects for the year (i.e., never)."
If I want any meaningful change, I think it will need to be something I work around management on.
We've definitely written lots of tests that felt like a net negative, and I think that's part of what burned some devs out on testing. When I joined, the few tests we had were "read a huge JSON file, run it through everything, and assert seemingly random numbers match." Not random, but the logic was so complex that the only sane way to update the tests when the code changed was to rerun and copy the new output. (I suppose this is pretty similar to approval testing, which I do find useful for code areas that shouldn't change frequently.)
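For what it's worth, the approval-testing pattern I mean can be done in a few lines without a library. This is just a sketch, and `complex_pipeline` is a hypothetical stand-in for the real logic:

```python
import json
import os

def complex_pipeline(data):
    # hypothetical stand-in for the real, complicated logic
    return {"total": sum(data["values"]), "count": len(data["values"])}

APPROVED = "approved_output.json"

def approval_test(data):
    actual = complex_pipeline(data)
    if not os.path.exists(APPROVED):
        # first run: record the current output as the approved baseline
        with open(APPROVED, "w") as f:
            json.dump(actual, f, indent=2, sort_keys=True)
        return True
    with open(APPROVED) as f:
        expected = json.load(f)
    return actual == expected
```

The point is that "rerun and copy the new output" becomes an explicit, reviewable step (delete or regenerate the approved file, then diff it in code review) instead of devs hand-editing magic numbers.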
Similar issue with integration tests mocking huge network requests. Either you assert that the request body matches an expected one, and have to update it whenever the signature changes (fairly common), or you ignore the body entirely, which feels like a much less useful test.
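One middle ground I've had some luck with is capturing the mocked request and asserting only on the fields that actually matter, leaving the volatile parts unchecked. A rough sketch (all names here are hypothetical, using stdlib `unittest.mock`):

```python
import json
from unittest import mock

def place_order(client, user_id, items):
    # hypothetical function under test that builds a request body
    body = {"user": user_id, "items": items, "client_version": "2.3.1"}
    return client.post("/orders", json.dumps(body))

client = mock.Mock()
client.post.return_value = "ok"
place_order(client, user_id=42, items=["a", "b"])

# capture what was sent, then pin down only the stable fields
(path, raw_body), _ = client.post.call_args
sent = json.loads(raw_body)
assert path == "/orders"
assert sent["user"] == 42
assert sent["items"] == ["a", "b"]
# volatile fields (client_version, timestamps, etc.) deliberately unasserted
```

That way a signature change only breaks the test when it touches something the test actually cares about.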
I agree unit tests are hard to mess up, which is why I mostly gravitate toward them. And TDD is fun when I actually do it properly.