US court fully legalized website scraping and technically prohibited it
Coverage of the decision in the LinkedIn vs hiQ Labs case. The appeals court also upheld a lower court ruling that prohibits LinkedIn from interfering with hiQ’s web scraping of its site. This fundamentally changes the balance of power in such cases in the future.
The dark side of .io: How the U.K. is making web domain profits from a shady Cold War land deal
Came to this (older) piece via the recent decision by draw.io to rename to diagrams.net, as they are not happy with the state of the .io TLD.
Software Development
Porting a JavaScript App to WebAssembly with Rust
A 3-part series documenting the exercise of rewriting a React+Redux application written in JavaScript to WebAssembly (WASM) with Rust.
- Super short article about stepping out from Ecto.Schema closer to raw SQL with Ecto. Neat trick is to use Map in your select: statement, so Ecto will return a list of Maps instead of a list of lists.
Cheap tricks for high-performance Rust
Pascal shares some simple tricks to speed up your Rust programs without really changing the source: hints like properly setting your target architecture, an alternative allocator, release profiles and more.
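Two of those hints are pure configuration. A minimal sketch of what they can look like in a standard Cargo project (illustrative values, not taken from the article):

```toml
# .cargo/config.toml -- compile for the CPU the binary will run on
[build]
rustflags = ["-C", "target-cpu=native"]

# Cargo.toml -- a more aggressive release profile
[profile.release]
lto = true          # whole-program link-time optimization
codegen-units = 1   # slower builds, better-optimized code
```

Note the two tables live in different files, as commented: the rustflags in .cargo/config.toml, the profile in Cargo.toml.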
When Bloom filters don’t bloom
Marek needed to deduplicate a large list of IP addresses, so he set sail on the journey of getting better than sort | uniq. He shares some lessons learned about random memory access latency, the power of cache-friendly data structures and Bloom filters, and finally “just” a hash table.
Moving a method from struct impl to trait causes performance degradation
On a very similar note to the previous one – code alignment having a significant impact on performance.
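The endpoint of Marek’s dedup journey above – “just” a hash table – can be sketched in a few lines of Rust (names are illustrative, not from his post):

```rust
use std::collections::HashSet;

// Deduplicate IPv4 addresses stored as u32, keeping first-seen order.
fn dedup(ips: &[u32]) -> Vec<u32> {
    let mut seen = HashSet::with_capacity(ips.len());
    let mut out = Vec::new();
    for &ip in ips {
        // insert() returns false when the value was already present,
        // so each address is emitted exactly once.
        if seen.insert(ip) {
            out.push(ip);
        }
    }
    out
}

fn main() {
    let ips = [0x0A00_0001, 0x0A00_0002, 0x0A00_0001];
    println!("{} unique", dedup(&ips).len()); // prints "2 unique"
}
```

The interesting part of the article is of course *why* this plain hash table ends up competitive with fancier probabilistic structures.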
Starter project for Flutter plugins willing to access native and synchronous rust code using FFI
Flutter meets Rust, wow.
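The Rust side of such a plugin boils down to exporting plain C-ABI functions that Dart can bind via FFI. A minimal sketch (function name and signature are illustrative, not from the starter project):

```rust
// A Rust function exported over the C ABI -- the shape a Flutter
// plugin would load from a compiled cdylib and call via dart:ffi.
#[no_mangle]
pub extern "C" fn add(a: i32, b: i32) -> i32 {
    a + b
}

fn main() {
    // Callable directly from Rust too; Flutter would call it
    // synchronously through the FFI binding.
    println!("{}", add(2, 3)); // prints "5"
}
```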
psql
- I never really gave recursion a thought in the context of SQL…
- pgtune on the web. Simple site to give you a psql configuration to start with for different use-cases and server configurations.
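Tools like this mostly tune a handful of memory and planner settings. An illustrative postgresql.conf fragment of the kind of parameters involved (example values for a small dedicated server, not actual pgtune output):

```
shared_buffers = 1GB            # ~25% of RAM is a common starting point
effective_cache_size = 3GB      # planner hint: OS + PG cache available
work_mem = 16MB                 # per-sort/hash memory budget
maintenance_work_mem = 256MB    # VACUUM, CREATE INDEX
```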
Machine Learning
- List of various papers related to BERT (Bidirectional Encoder Representations from Transformers). BERT was released by Google as part of their NLP research, but researchers have since taken it further and you can find multi-modal applications as well.
Exploring Transfer Learning with T5: the Text-To-Text Transfer Transformer
New publication by Google in the NLP space. They have published a new model called the Text-To-Text Transfer Transformer (T5). They have also open-sourced a new pre-training dataset, called the Colossal Clean Crawled Corpus (C4).
Transformers are Graph Neural Networks
Drawing parallels between Transformers (a key component of BERT) and Graph Neural Networks.