Appreciate you taking a moment to write up your thoughts. I'm torn between disappointment and relief that there do not seem to be any poorly made video recordings of the proceedings. I'm landing on relief and expect that my own imagination, along with some working papers and blogposts, is better than muffled voices on YouTube.
And thanks for sharing that Barzun quote. I rather like Brad DeLong's "anthology intelligence" or Shalizi’s own "collective cognition" but now I'll finally go read House of Intellect.
The organizers tell me the videos will be on YouTube soon. Stay tuned!
I'm already disappointed that I can't watch them right now.
"Not having to think is often a good thing! Tradition lets us externalize certain processes so we can focus on other tasks. Formalities strengthen cultural connections. Traditions in communication help us understand each other better and come to consensus faster."
Absolutely!
These days I like to turn Wittgenstein's 'language being based on shared experiences' into 'culture being based on shared convictions'. Shared convictions minimise friction and transaction costs, and they increase speed. The aspect of convictions (beliefs, assumptions, etc.) is an essential perspective, I think.
Human intellect is to a large part automated, in the form of convictions and beliefs that we can apply efficiently and quickly (most of your judgements about what people tell you are instantaneous, for instance). This makes us both energy-efficient and fast, something evolution has selected for.
Even at the tribe level, shared convictions are essential. If you have to cooperate, the people you cooperate with need to be predictable: if your fellow tribe member isn't, the common hunt comes up empty. Trust (assumptions about future actions) is thus essential.
Both forms of 'mental automation' — at the individual level and at the tribe level — are necessarily stable, i.e. hard to change. That stability is a conditio sine qua non for efficiency and speed.
Humans have expanded the use of 'shared convictions' from tribe scale to enormous scale, often based on stabilising institutions (religion, laws, etc.), but the larger the group, the more brittle this setup becomes. We evolved for tribes of 150-200 members (Dunbar's number) but have expanded this to brittle 'tribes' of hundreds of millions. And since information is the glue, the information revolution and its siblings (information misuse, information warfare, etc.) create havoc in the human world.
Nothing new under the sun
As intelligence is wordless and built on parallel oscillation, there is no intellect in text, nor in the total body of text; just as, per Halliday, "language is not the sum of all possible expressions in language", there is no intellect in any form of AI. The idea that AI is cultural is dangerous, because code is closer to UFOlogy than to either science or engineering. Code does not adhere to any biological or physical regularities that would impose limits on it (like airflow or gravity), so essentially it is a magic that creates outcomes by switches.
AI is more a disproof of language as any idealization of intellect: arbitrary code disproving arbitrary metaphors.
See Dario Amodei: "Meaning isn't an engineering problem. I don't feel like I have the answer."
He's admitting the fallacy of code.