I have been conducting research since 1997. I believe in the qualitative approach, since it is the best fit for my work: I need to study and reveal the richness, depth, and context of the phenomenon under study. I rarely touch statistics or econometrics (or any other metrics, for that matter).
I am not saying that statistics is bad; on the contrary, I believe statistics is one of the best products of our civilization. In many cases, I refer to statistics gathered, compiled, and analyzed by somebody else.
What I cannot stand is the abuse of statistics. In my early years as a lecturer, I attended a research training run by my employer. One of my fellow young lecturers (he passed away a long time ago, God bless his soul!) said that statistics is like a bikini: it looks good, but it does not reveal everything.
What I have seen recently is that much research uses a myriad of statistical tools such as PLS, SEM, and so on, and yet the use is often inappropriate (IMHO). Once I saw a paper with 13 hypotheses, and all of them were rejected. I may not be an expert, but something must be wrong there. Another example: a long time ago I was examining an undergraduate thesis. The student presented a result showing that the factors she studied could explain almost 40% of the changes in stock prices. I asked: what about the other 60%? Why should I rely on her study for an investment decision? She could not answer my question and eventually failed the exam.
It might be my bias as a qualitative adopter, but I remain convinced that you need to look at the research problem as a whole: what you want to achieve, and the data you have; then, and only then, select the tools to analyse the data. My point is that before you select any statistical tool, get a grip on the concept or theory your whole research rests upon. A friend of mine often shows that a simple scatter plot of financial data can reveal the meaning before any thorough statistical analysis.