I think most frequentists would agree that Bayesian inference is more intuitive. Bayesian inference is much more computationally difficult though, and you usually get the same answer anyways. (Bayesian estimators are typically asymptotically equivalent to classical estimators!)
> and you usually get the same answer anyways
I don’t agree with this! In reality we don’t get asymptotic properties, we get finite sample properties, and these can vary greatly. E.g. MLE often won’t even converge for hierarchical models without a fair amount of data. Also, for bespoke models there often isn’t a published frequentist estimator available, and attempting to derive one would be a much bigger issue for most people than the computational resources required for MCMC or variational inference.
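To make the finite-sample vs. asymptotic distinction concrete, here is a minimal sketch using a conjugate Beta-Binomial model (chosen for illustration; the Beta(2, 2) prior is an arbitrary assumption, not anything from the thread). The Bayesian posterior mean and the MLE visibly disagree at small n and converge as n grows:

```python
def mle(successes: int, trials: int) -> float:
    """Maximum-likelihood estimate of a Bernoulli probability."""
    return successes / trials

def posterior_mean(successes: int, trials: int,
                   a: float = 2.0, b: float = 2.0) -> float:
    """Posterior mean under a conjugate Beta(a, b) prior.

    Beta(2, 2) is a weakly informative prior centered at 0.5,
    picked purely for this example.
    """
    return (successes + a) / (trials + a + b)

# True rate 0.3; observe proportional counts at increasing sample sizes.
# The gap between the two estimators shrinks roughly like O(1/n).
for n in (10, 100, 10_000):
    k = round(0.3 * n)
    gap = abs(mle(k, n) - posterior_mean(k, n))
    print(f"n={n:>6}: |MLE - posterior mean| = {gap:.5f}")
```

At n = 10 the prior pulls the posterior mean noticeably toward 0.5; by n = 10,000 the two estimates are nearly indistinguishable, which is the asymptotic-equivalence claim above. The disagreement in the reply is exactly about the small-n regime, where the gap (and, in harder models like hierarchical ones, whether the MLE exists at all) matters.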