I can't wrap my head around what the Wikipedia article or the answer here says.
Can someone explain Rule 110 in simple terms? How does it guarantee Turing completeness?
My attempt at a succinct, layman's terms explanation:
Rule 110 is an elementary cellular automaton: a rule for transforming one finite row of 1's and 0's into another, where each cell's new value depends only on its previous value and the previous values of its two immediate neighbors.
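To make that concrete, here is a minimal sketch in Python (my own illustration, not taken from the original proof). The entire rule is just a lookup table from the eight possible (left, center, right) neighborhoods to the new cell value; reading those eight outputs as a binary number gives 01101110, which is 110 in decimal, hence the name.

```python
# Rule 110 in full: the new value of a cell depends only on the triple
# (left, center, right) directly above it. Reading the outputs for the
# neighborhoods 111, 110, ..., 000 as a binary number gives 01101110 = 110.
RULE_110 = {
    (1, 1, 1): 0,
    (1, 1, 0): 1,
    (1, 0, 1): 1,
    (1, 0, 0): 0,
    (0, 1, 1): 1,
    (0, 1, 0): 1,
    (0, 0, 1): 1,
    (0, 0, 0): 0,
}

def step(cells):
    """Apply Rule 110 once to a finite row of 0s and 1s.

    Cells outside the row are treated as 0 (a real automaton has an
    infinite row; a fixed-width row is just a convenient simplification).
    """
    padded = [0] + list(cells) + [0]
    return [RULE_110[(padded[i - 1], padded[i], padded[i + 1])]
            for i in range(1, len(padded) - 1)]
```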
When Rule 110 is iteratively applied to certain input bit sequences, patterns emerge that depend on sub-sequences found in the input bits. Given enough iterations, the following can happen (a small simulation sketch follows this list):
The original sub-sequence appears in the same location as in the original input.
The original sub-sequence is preserved but 'moves' to a different location in the bitfield.
Two sub-sequences moving toward each other interact and 'pass through' each other.
Two sub-sequences combine to create a new sub-sequence.
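A rough way to watch such persistent, moving structures appear is to iterate the `step` function from the sketch above and print each generation. The starting row below (a single 1 on a background of 0's) is only an illustrative choice, not one of the carefully engineered patterns used in the actual construction.

```python
# Print a small space-time diagram: each line is one generation.
# Rule 110 grows leftward from a lone 1, and repeating triangular
# structures quickly become visible. Assumes the step() sketch above.
row = [0] * 30 + [1] + [0] * 5
for _ in range(24):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Localized structures that keep their shape while drifting across the background are the "sub-sequences" the list above is talking about.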
Different sub-sequences can be given symbolic meanings such as '1', '0', 'clock pulse', or 'production rule', which correspond to the elements of a cyclic tag system.
With many iterations of Rule 110 on a carefully constructed input bitfield, the interaction of the sub-sequences simulates the behavior of a cyclic tag system.
A cyclic tag system can be used to simulate a universal Turing machine. Thus a cyclic tag system is Turing-complete.
Since Rule 110 can simulate a cyclic tag system, it too is Turing-complete.
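For a sense of what the simulated machine actually is, a cyclic tag system is simple enough to fit in a few lines. The sketch below is my own toy illustration; the starting word and production list are arbitrary, not the ones used in the real Rule 110 proof. At each step the system looks at the leading symbol of the current word, appends the current production to the end if that symbol is 1, drops the leading symbol, and advances (cyclically) to the next production.

```python
from itertools import cycle

def cyclic_tag(word, productions, max_steps=12):
    """Run a cyclic tag system and return the sequence of words produced.

    Step rule: if the leading symbol is '1', append the current production;
    then delete the leading symbol and move to the next production in a
    fixed cycle. Halts if the word becomes empty.
    """
    history = [word]
    prods = cycle(productions)
    for _ in range(max_steps):
        if not word:
            break
        prod = next(prods)
        if word[0] == "1":
            word += prod
        word = word[1:]
        history.append(word)
    return history

# Arbitrary toy example -- NOT the productions from the actual construction.
for state in cyclic_tag("11", ["010", "000", "1111"]):
    print(state or "(empty)")
```

Because its step rule is so simple, a cyclic tag system is a convenient intermediate target: it is far easier to encode its behavior in colliding Rule 110 patterns than to encode a Turing machine directly.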
What is, in your opinion, the most surprising, weird, strange or really "WTF" language feature you have encountered?
Please only one feature per answer.
I hear a lot that new programming languages are dynamically typed, but what does it actually mean when we say a language is dynamically typed vs. statically typed?
I was just wondering what programming languages Windows, Mac OS X, and Linux are made up of, and what languages are used for each part of the OS (i.e., kernel, plug-in architecture, GUI components, etc.).
I assume that …