As I recall, a constant-time random-access array with mutation can be emulated in a functional language using balanced trees, at O(log n) cost per access or update.
This means any functional programming language can implement any imperative programming language with at most an O(log n) factor of overhead.
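To make the balanced-tree idea concrete, here is a minimal sketch in Python (the thread names no language, and all names here are my own): a fixed-size persistent "array" stored as a complete binary tree of nested tuples, where `set_` rebuilds only the O(log n) nodes on the path to the index and shares everything else, so old versions remain intact.

```python
# Persistent array as a balanced binary tree of immutable tuples.
# get and set_ are O(log n); set_ returns a NEW tree sharing all
# untouched subtrees with the old one (the old version stays valid).

def build(xs):
    """Build a balanced tree from a non-empty list."""
    if len(xs) == 1:
        return ('leaf', xs[0])
    mid = len(xs) // 2
    return ('node', build(xs[:mid]), build(xs[mid:]), mid)

def get(t, i):
    """Read index i, descending by comparing against the split point."""
    if t[0] == 'leaf':
        return t[1]
    _, left, right, mid = t
    return get(left, i) if i < mid else get(right, i - mid)

def set_(t, i, v):
    """Return a new tree with index i replaced by v; no mutation."""
    if t[0] == 'leaf':
        return ('leaf', v)
    _, left, right, mid = t
    if i < mid:
        return ('node', set_(left, i, v), right, mid)
    return ('node', left, set_(right, i - mid, v), mid)

a0 = build([10, 20, 30, 40])
a1 = set_(a0, 2, 99)
# get(a1, 2) == 99, while get(a0, 2) is still 30
```

The structural sharing is what makes this purely functional: an "update" is just a new root plus one fresh path, which is exactly where the O(log n) factor comes from.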
I guess as soon as you have array mutation, you have assignment, so it's already imperative? The trick I'm thinking of definitely didn't involve any asymptotic overhead; it was more a way to emulate assignment and imperative statements using pure functions. It seemed like a neat observation at the time, though with hindsight it's a little less surprising. Still somewhat surprising, I suppose, since the article asks the question.
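The commenter doesn't say which trick they mean, but one zero-overhead way to emulate assignment with pure functions is explicit state-passing (the idea behind the State monad). A hedged sketch in Python, with all names (`get_var`, `set_var`, `seq`) invented here for illustration: each "statement" is a function from a state to a (result, new state) pair, and sequencing threads the state through.

```python
# State-passing style: an imperative "statement" becomes a pure
# function state -> (result, new_state). No variable is ever mutated;
# each step returns a fresh state dictionary.

def get_var(name):
    """Statement that reads a variable from the state."""
    return lambda state: (state[name], state)

def set_var(name, value):
    """Statement that 'assigns' by returning an extended copy of the state."""
    def step(state):
        new_state = dict(state)  # copy instead of mutating
        new_state[name] = value
        return (None, new_state)
    return step

def seq(*steps):
    """Run statements in order, threading the state through each one."""
    def run(state):
        result = None
        for step in steps:
            result, state = step(state)
        return (result, state)
    return run

prog = seq(set_var('x', 1), set_var('y', 2), get_var('x'))
result, final_state = prog({})
# result == 1, final_state == {'x': 1, 'y': 2}
```

Each "assignment" is O(1) modulo the cost of copying the environment, which a real implementation avoids with persistent maps; the point is only that sequencing and assignment fall out of ordinary function composition.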
Maybe you're thinking of the zipper data structure? [1]
You represent the composite structure as hanging from the point you want to "mutate", so updating that point is merely dropping the old focus and attaching the updated value in its place.
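A minimal sketch of the zipper idea for a flat list, in Python (names are my own): the structure is split into a reversed prefix, the focused element, and a suffix, so replacing the focus is O(1) and purely functional.

```python
# List zipper: (reversed_prefix, focus, suffix).
# Moving the focus and updating it are O(1) pure operations.

def from_list(xs):
    """Focus on the first element of a non-empty list."""
    return ([], xs[0], xs[1:])

def right(z):
    """Shift focus one step right: push old focus onto the prefix."""
    prefix, focus, suffix = z
    return ([focus] + prefix, suffix[0], suffix[1:])

def set_focus(z, v):
    """Replace the focused element: drop it, attach the new value."""
    prefix, _, suffix = z
    return (prefix, v, suffix)

def to_list(z):
    """Rebuild the plain list from the zipper."""
    prefix, focus, suffix = z
    return list(reversed(prefix)) + [focus] + suffix

z = from_list([1, 2, 3])
z = set_focus(right(z), 99)
# to_list(z) == [1, 99, 3]
```

The same idea generalizes to trees: the "prefix" becomes the path of contexts from the root down to the focused node, which is what makes local edits cheap without any mutation.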