21 Comments

I began reading your post, realized "hey -- I was just scribbling about that myself", and finally remembered what the context was.

The combination of immutable data (with an inherent timestamp) and the idea of an "effective date" is pretty damned powerful ... and much more representative of reality than the good ol' "just keep a snapshot, bytes are EXPENSIVE" days of dozens and dozens of megabytes in a refrigerator-sized behemoth!!!!

Oct 17, 2023 · edited Oct 17, 2023 · Liked by Kent Beck

Hi Kent, thanks for this article; I love the principles you take us through to build up the argument that some complexity can be necessary. Like a rocket, where every kilo of fuel is a tradeoff against the weight it adds, we're faced with adding just enough complexity to solve the problem but no more. And the messy real-life complexity you describe here definitely resonates!

Reading the article, the words "event sourcing" kept ringing in my ears, and I think you're perhaps not using those words because you're purposefully describing the problem from a business point of view? The solution you propose fits into that same business context, which makes the article very clear and understandable. But I wonder if you'd agree that event sourcing is one implementation that can achieve Eventual Business Consistency?

I say this because an event would model the backdateable date and also inherently carry a timestamp showing when it was received, so it naturally expresses the two timelines. And in an event-sourced system, the data that needs to be reprocessed would also be available as events, which increases the likelihood of having all the necessary data for reprocessing, compared to only having access to the current state of normalized data.
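As a concrete illustration (the names `AddressChanged` and `address_as_of` are invented for this comment, not taken from the article), here is a minimal Python sketch of an event carrying both the effective date and the received/recorded timestamp, and of replaying events to answer a question along either timeline:

```python
from dataclasses import dataclass
from datetime import date, datetime

# Hypothetical event carrying both timelines: the effective (backdateable)
# date the change applies to, and the timestamp at which we learned about it.
@dataclass
class AddressChanged:
    customer_id: str
    new_address: str
    effective_date: date      # when the move "really" happened
    recorded_at: datetime     # when the system received the event

events = [
    AddressChanged("c1", "12 Old Rd",  date(2023, 1, 1), datetime(2023, 1, 1, 9, 0)),
    # Back-dated correction: on Aug 7 we learn the move actually happened Aug 1.
    AddressChanged("c1", "34 New Ave", date(2023, 8, 1), datetime(2023, 8, 7, 14, 30)),
]

def address_as_of(events, customer_id, effective, known_at):
    """What address did we believe applied on `effective`,
    given only the events recorded by `known_at`?"""
    current = None
    for e in sorted(events, key=lambda e: (e.effective_date, e.recorded_at)):
        if (e.customer_id == customer_id
                and e.effective_date <= effective
                and e.recorded_at <= known_at):
            current = e.new_address
    return current

# Before the correction was posted, Aug 2 still resolves to the old address...
print(address_as_of(events, "c1", date(2023, 8, 2), datetime(2023, 8, 5)))  # 12 Old Rd
# ...after the correction is posted, the same effective date resolves differently.
print(address_as_of(events, "c1", date(2023, 8, 2), datetime(2023, 8, 8)))  # 34 New Ave
```

Replaying with a different `known_at` cut-off is exactly the "what did we believe then vs. what do we believe now" question that the two timelines let you ask.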

More generally, I've been drawn to event sourcing architectures precisely because they reflect a fundamental reality of the world, where messy, error-prone, back-filled corrections are necessary to keep everything in sync. Reality is dominated by events more than state, if I can get away with such philosophizing, which is why I feel it connects so well to what you describe.

I'd love to hear your thoughts on this, but above all else thank you for sharing the article.


The appendix about the analogy reminds me how much I love Pat Helland's explanation of why convergence could be a better name (if eventual consistency weren't already everywhere), in "Don't Get Stuck in the "Con" Game" (https://pathelland.substack.com/p/dont-get-stuck-in-the-con-game?nthPub=281).

This makes me think: do you think "timeline convergence" could be a better name than bi-temporality? I like the idea that the _timeline_ converges (not the timestamps), since what converges is the state of the complete timeline, including past timestamps, e.g. in the error-correction "you got my address change wrong" case.

Aug 8, 2023 · Liked by Kent Beck

Happy to see this article, Kent. I believe there are a lot of benefits here, the most notable one being the customer-facing benefit you illustrate: proving to the customer that you're smart enough to be aware of (in your example) changes of physical address. Although digital systems emphasize this less (standardizing addresses is critical to continuing to charge recurring subscriptions), other aspects of customer history can benefit from this. Also, from a data auditability perspective, providing temporality can be invaluable for improving data hygiene. I certainly hope that these considerations become more widely known and used. I work for a DB company that offers temporality out of the box, where you can query what a data record looked like at a specific time in the past. This does not sacrifice performance/low latency, distributed writes, or transaction integrity - the point being: there are already database products out there that offer this robustness. Thanks again for the great article. - Luis @ fauna.com

Aug 8, 2023 · Liked by Kent Beck

Kent, thanks for writing this up.

Even for people who already understand a concept, it's hugely valuable, since we can leverage your reflections and refinements.

I like the terms effective and posted dates; this is clearer than I’ve seen it expressed before.


Interesting take. Thanks Kent.

Aug 5, 2023 · Liked by Kent Beck

Accounting systems have actually used this concept of bi-temporal data for a while, without having such a succinct name and explanation for it, which has made it difficult for non-accountants to grasp. Kudos for labeling and visualizing it so clearly!

With the increase in real-time reporting out of financial accounting systems, we'd need to extend this model to tri-temporal -- 1) the time the event happened "in the real world", 2) the financial reporting period it belongs to (as these need to be locked periodically), 3) the time of recording in the database.
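As a rough illustration (the field names here are purely invented, not any particular accounting system's schema), a tri-temporal journal entry would carry all three times side by side:

```python
from dataclasses import dataclass
from datetime import date, datetime

# Illustrative only: one record carrying the three times described above.
@dataclass
class JournalEntry:
    amount: float
    event_date: date        # 1) when the transaction happened in the real world
    reporting_period: str   # 2) the (lockable) financial reporting period, e.g. "2023-08"
    recorded_at: datetime   # 3) when the entry landed in the database

entry = JournalEntry(
    amount=125.00,
    event_date=date(2023, 7, 28),   # the sale happened in July...
    reporting_period="2023-08",     # ...but July was already locked, so it posts to August
    recorded_at=datetime(2023, 8, 3, 10, 15),
)
print(entry)
```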


Hey Kent, great article which explains things for geeks, thank you! The fun thing is that "The Analogy" part of your post describes the same thing I heard today on the Software Engineering Daily podcast episode "CAP Theorem 23 Years Later with Eric Brewer". So what your example describes is the CAP theorem: you can get either consistency or availability.

Link to podcast episode: https://softwareengineeringdaily.com/2023/07/25/cap-theorem/

Aug 4, 2023 · edited Aug 4, 2023

> Part of the reason it hasn’t taken off is because of the additional complexity it imposes on programmers

The trouble is that SQL and the existing crop of databases make working with time like this far harder than it ought to be. I work on https://xtdb.com where we are building a database engine in which all data is bi-temporal by default, and crucially, without imposing a tonne of schema & query language boilerplate on developers who are building applications that (currently!) only care about 'now'. I think this bi-temporal-by-default approach may be the only way the concept can succeed.
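To make the "by default" part concrete, here is a deliberately simplified conceptual sketch in Python -- emphatically not XTDB's actual API -- of a store where system time is stamped automatically, valid time defaults to now, and a query that never mentions time just sees the latest view:

```python
from datetime import datetime, timezone

# Conceptual sketch only: every write records both a valid-time and a
# system-time, but callers who only care about 'now' never have to say so.
class BitemporalStore:
    def __init__(self):
        self._rows = []  # (key, value, valid_from, system_from)

    def put(self, key, value, valid_from=None):
        now = datetime.now(timezone.utc)
        # System time is always stamped by the store; valid time defaults to now
        # but can be back-dated when a correction arrives late.
        self._rows.append((key, value, valid_from or now, now))

    def get(self, key, valid_time=None, system_time=None):
        valid_time = valid_time or datetime.now(timezone.utc)
        system_time = system_time or datetime.now(timezone.utc)
        match = None
        for k, v, vf, sf in self._rows:
            if k == key and vf <= valid_time and sf <= system_time:
                match = v  # last qualifying write along both timelines wins
        return match

store = BitemporalStore()
store.put("address", "12 Old Rd")
store.put("address", "34 New Ave")
print(store.get("address"))  # "34 New Ave" -- the 'now' view, no temporal boilerplate
```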

For anyone curious to hear more, this recent presentation I gave on "UPDATE Considered Harmful" may be of interest: https://www.youtube.com/watch?v=JxMz-tyicgo ...and shameless plug (sorry Kent!): I'm also giving a webinar about bi-temporality specifically next week: https://attendee.gotowebinar.com/register/2960607012900067930?source=xtdb-discuss


Hmm, I agree it's a solution for certain cases, but wouldn't it be easier here to say "a change applies when you communicate it, and if you realise too late to inform us about the move then that's your problem"?
