Uh, when did I say that?
You're the one who's literally ignoring a disclaimer and saying that I need to show my work for pointing out that ignoring a disclaimer doesn't prove your argument? What work am I to show here? Am I supposed to break down why ignoring a disclaimer doesn't work again? I assume you're smart enough to understand what I wrote the first time.
Let's go over it again.
I cited data that showed a downward trend in the day-to-day numbers of new cases, new hospitalizations, and new deaths in NY City. I stated there was a downward trend.
Dieter came in and stated that, no, there were 75,000 new cases, in what now appears to be a misprint (see his post upthread).
At the time I responded to him, it wasn't known that it was a misprint, so I again asserted what the numbers were saying, and also noted that they are coming from the NYC Dept. of Health, and provided links.
You came in and noted the disclaimer to cast doubt on what I was asserting (i.e. that there was a downward trend in daily positives, hospitalizations, and deaths in NYC). So that's where I brought that number in, as you seemed to be supporting Dieter and his apparent claim of 75k new cases. I've conceded that the numbers can change within the space of a few hours, and I've actually seen that on that website. However, as I've noted, it's usually marginal changes to the more recent data. Data more than a day or two old has not changed on the website as long as I have been following it, so it's probably safe to assume those numbers are accurate, or pretty close to it.
As far as the rest of your post, yes, you should hear what the experts have to say. But experts, like the rest of us, are human beings and make mistakes. They are not infallible, and certainly have not been infallible in this pandemic. You should compare what they say - and especially what they project - to the actual situation that can be quantified by empirical data. In another post upthread, I compared actual hospitalizations, nationwide, with projected hospitalizations. There is a huge disparity in those numbers, with the projected numbers being several times greater than the actual numbers. When I see a disparity like that, my antenna goes up, and I begin to question the expert that made the initial projection - especially when they refuse to adjust their model to the actual data. You should too. A little skepticism, even towards the experts, can be a good, healthy thing.
I'll give you another example, completely separate from this situation. In 2002 and early 2003, all the experts in the field assured us that Iraq had large stockpiles of weapons of mass destruction. Every one of our intelligence agencies was saying that. So were the intelligence agencies of multiple other countries, including all of those with which we have the Five Eyes agreement. To question the experts at that time caused one to be branded as unpatriotic, as someone who did not take seriously the issue of national security.
So, on the premise that Iraq had massive stockpiles of WMDs, we invaded. And what did we find out after the invasion? That Iraq did not have massive stockpiles of WMDs, and, at best, had barely functional WMD programs. The difference between what the experts told us and the actual reality was stark.
Experts can be wrong too, and badly so. It doesn't mean they are always wrong, or even mostly wrong. But it does mean you should not take them at face value, and that you should put their assertions and proclamations to the test, and assess the expert's credibility accordingly. Because, as the Iraq War shows, when they are wrong, their mistakes can have huge, negative consequences.