https://news.ycombinator.com/item?id=13713480 - Announcing the first SHA-1 collision (2017-02-23)
Ange Albertini is listed as an author in both articles.
Also, I wonder how a combined approach affects the security of cryptosystems here, i.e. if you split the message into smaller parts and hash them with multiple algorithms (SHA-1 + SHA-256), how much more infeasible can you make this kind of "attack"?
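As a minimal sketch of what I mean (Python's hashlib; concatenating the digests is just one obvious way to combine them, and whether that is actually much stronger than the strongest component is a separate question):

    import hashlib

    def dual_hash(data: bytes) -> str:
        # A forgery now has to collide under SHA-1 and SHA-256
        # simultaneously on the same bytes.
        return hashlib.sha1(data).hexdigest() + hashlib.sha256(data).hexdigest()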
So you could have a global hash checked with 3 algorithms to serve as a sort of "quick check" (not much of a hurdle on its own, but better than nothing if you theorize there are attacks 1000x faster than the birthday bound against each hash), plus a deeper check where you split a 10 GB file into, say, 10 MB chunks and hash each chunk separately; that gives you 1000 chunks, and since the attacker now needs a collision in every one, you gain back at least a 1000x factor in attack complexity.
For streams where the order of bits matters, I speculate this may be even more difficult to attack, since each attacked chunk hash is now constrained by the data that fills the preceding and following chunks, if not the whole set of chunks (so perhaps 1,000,000x or greater difficulty?).
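Putting the whole idea together, here is a rough sketch of the chunked scheme (hypothetical names and chunk size; the third global algorithm, SHA-512, is an arbitrary stand-in for "checked 3 times"):

    import hashlib

    CHUNK_SIZE = 10 * 1024 * 1024  # 10 MB chunks, so a 10 GB file yields ~1000

    def chunked_manifest(path: str) -> dict:
        # Global digests over the whole file serve as the quick check;
        # the per-chunk digest list is the deeper check. An attacker would
        # need a collision in every chunk, and each chunk's colliding bytes
        # are constrained by their position in the stream.
        global_hashes = [hashlib.sha1(), hashlib.sha256(), hashlib.sha512()]
        chunk_digests = []
        with open(path, "rb") as f:
            while True:
                block = f.read(CHUNK_SIZE)
                if not block:
                    break
                for h in global_hashes:
                    h.update(block)
                chunk_digests.append(hashlib.sha256(block).hexdigest())
        return {
            "global": [h.hexdigest() for h in global_hashes],
            "chunks": chunk_digests,
        }

Verification would just recompute the same manifest and compare; as a bonus, the per-chunk list also tells you which chunk was tampered with.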