Not a real thing, but an often-deployed label (marketing term?), so let's get into it.
Two main camps today (and historically?): Symbolic and statistical. (Meng Weng Wong describes these as Apollonian and Dionysian.) Symbolic AI people gravitate towards e.g. logic programming and explicit ontologies (sometimes called “Good Old-Fashioned AI”, or GOFAI). Statistical AI is, roughly, machine learning.
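To make the split concrete, a toy contrast (entirely my own sketch; the email-detection task, the features, and the labelled examples are all made up for illustration): the symbolic program *is* an explicit, inspectable rule, while the statistical program induces its behaviour from labelled examples and has no rule you can point to.

```python
# Toy contrast: symbolic vs. statistical approaches to the same tiny task.
# Task: decide whether a string looks like an email address.

# Symbolic camp: the knowledge is written down as an explicit rule.
def is_email_symbolic(s: str) -> bool:
    # Exactly one "@", non-empty local part, a dot somewhere in the domain.
    if s.count("@") != 1:
        return False
    local, domain = s.split("@")
    return bool(local) and "." in domain

# Statistical camp: behaviour is induced from labelled examples.
# A laughably small 1-nearest-neighbour model stands in for real ML here.
LABELLED = [
    ("alice@example.com", True),
    ("bob@mail.org", True),
    ("not an email", False),
    ("hello world", False),
]

def featurize(s: str) -> tuple[int, int, int]:
    # Crude features: contains "@", contains ".", contains a space.
    return (int("@" in s), int("." in s), int(" " in s))

def is_email_statistical(s: str) -> bool:
    # Copy the label of the nearest labelled example in feature space;
    # no explicit rule appears anywhere.
    x = featurize(s)
    nearest = min(
        LABELLED,
        key=lambda ex: sum((a - b) ** 2 for a, b in zip(x, featurize(ex[0]))),
    )
    return nearest[1]

print(is_email_symbolic("carol@uni.edu"))     # True
print(is_email_statistical("carol@uni.edu"))  # True
```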
AI is a strange sort of cultural object in a lot of ways… How is it that LP and ML, two technology worlds that have almost nothing to do with each other, fall under the same label? Seems to have more to do with how the makers of the technology would like users to relate to it: rather than being a “tool”, the software has its own identity and intentionality; it does the “thinking” for you (as opposed to the “routine”, “mechanical” stuff we expect from a computer). Compare to e.g. a spreadsheet, which (1) is completely transparent and (2) “only” does “math”.
ARC is an “intelligence” benchmark [1]
Why human agency is still central to Israel’s AI-powered warfare
Deep Learning Is Hitting a Wall: Advocates for neuro-symbolic AI, reconciling symbol manipulation and statistical methods (see e.g. LLM -> PDDL [2]; a sketch of that pipeline follows this list of links)
A Union Busting Chatbot? Eating Disorders Nonprofit Puts the 'AI' in Retaliation | Labor Notes
Why transformative artificial intelligence is really, really hard to achieve
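The “LLM -> PDDL” note above deserves a sketch of what that pipeline looks like, as I understand the Guan et al. paper [2]. Everything below is a placeholder of my own: `llm_generate_pddl_domain`, the `planner` executable, and the prompt are stand-ins, not any real library's API or the authors' actual code. The point is just the division of labour: the LLM drafts an explicit symbolic world model (a PDDL domain), and an off-the-shelf classical planner does the sound search over it.

```python
# Rough shape of an "LLM -> PDDL -> classical planner" pipeline (placeholders only).
import subprocess
import tempfile
from pathlib import Path

PROMPT_TEMPLATE = """You are given a natural-language description of a planning domain.
Write a PDDL domain definition (types, predicates, actions with preconditions
and effects) that captures it. Output only PDDL.

Description:
{description}
"""

def llm_generate_pddl_domain(description: str) -> str:
    """Placeholder: format PROMPT_TEMPLATE with `description`, send it to
    whatever LLM you have access to, and return the PDDL text it produces."""
    raise NotImplementedError("wire up an LLM client here")

def solve_with_classical_planner(domain_pddl: str, problem_pddl: str) -> str:
    """Placeholder: hand the symbolic model to an off-the-shelf PDDL planner
    and return the plan it prints. The `planner` command is illustrative only."""
    with tempfile.TemporaryDirectory() as tmp:
        domain_path = Path(tmp, "domain.pddl")
        problem_path = Path(tmp, "problem.pddl")
        domain_path.write_text(domain_pddl)
        problem_path.write_text(problem_pddl)
        result = subprocess.run(
            ["planner", str(domain_path), str(problem_path)],
            capture_output=True, text=True, check=True,
        )
        return result.stdout

def plan(description: str, problem_pddl: str) -> str:
    # The LLM does the fuzzy part (constructing a world model); the planner
    # does the sound part (searching over that explicit model).
    domain_pddl = llm_generate_pddl_domain(description)
    return solve_with_classical_planner(domain_pddl, problem_pddl)
```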
[1] François Chollet, “On the Measure of Intelligence,” November 25, 2019, https://doi.org/10.48550/arXiv.1911.01547.
[2] Lin Guan et al., “Leveraging Pre-trained Large Language Models to Construct and Utilize World Models for Model-based Task Planning,” November 1, 2023, https://doi.org/10.48550/arXiv.2305.14909.