Distill was a new take on publishing research and ideas in deep learning in a visual way: https://distill.pub/ I love their articles, and while it was hard to sustain, the quality of the ones in there is pretty high. They provide some tips and templates on how to develop such visual storytelling articles. - Source: Hacker News / 7 months ago
Explainable AI is far from its early stages. Read into Anthropic's work in mechanistic interpretability, like toy models of superposition, along with the rest of the transformer-circuits papers. Read Chris Olah's Distill papers. Read Neel Nanda's recent work on reverse engineering how language models grok modular addition. Read Kevin Meng's work on locating and editing facts inside of GPT. Read OpenAI's paper on... Source: 11 months ago
I also wasn't aware of either The Pudding or distill.pub. So thanks for just mentioning those. Source: about 1 year ago
Anything from Setosa [0] is really good. It contains interactive, animated illustrations of several Machine Learning ideas. I _loved_ reading papers from Distill Pub [1] as they contained interactive diagrams. My favorite one so far is the thread on Differentiable Self-organizing Systems [2]. I liked the lizard example very much as it is interactive, and lizards grow lost organs back. I think this is funny.... - Source: Hacker News / over 1 year ago
If you include deep learning in CS then https://distill.pub/ has a lot to offer in this category. - Source: Hacker News / over 1 year ago
Not sure if this counts as a blog, but I really liked Distill: https://distill.pub/. - Source: Hacker News / over 1 year ago
Love how things like this, Papers With Code, and distill.pub are letting academic papers finally evolve past paper :). Source: over 1 year ago
Your talents may be better employed at tasks that require deeper thought and rigor, like interpretability: https://distill.pub/. Source: over 1 year ago
Distill, although they quit last year - https://distill.pub/. Source: over 1 year ago
However, there are plenty of blog posts that are already cited in the ML community and that address these points: - Independent blog posts. Some of them have been cited numerous times according to [google scholar](https://scholar.google.com/scholar?cluster=2043775592782102444) and are very transparent about their updates. (Though the GitHub repo of the blog post is not public, which would address the consistency... Source: over 1 year ago
Articles in https://distill.pub/ explore relevant topics. Maybe some of the topics might be useful to you. Source: almost 2 years ago
I would argue that math, proofs and other technical details should never disappear. In that sense, papers should never disappear. But it might be interesting to see what website proceedings could look like (probably something like Distill). Source: almost 2 years ago
This is cool. I’m wondering what the responsive part is — the example seems like it’s just latex2html output basically? I think it would be super cool to have more https://distill.pub style papers, with like interactive plots instead of static image files and stuff. - Source: Hacker News / about 2 years ago
Interpretability research on distill.pub and some of Anthropic's recent papers. Source: about 2 years ago
I hadn't heard about Works in Progress before, but it looks like incredibly well researched content. Pretty stark difference between the content here and what you see on something like sfgate.com or cnn.com. Reminds me of another incredible resource for deep learning articles: https://distill.pub/, which unfortunately looks like it is going silent. (Also I find it pretty awesome that Stripe is successful enough to... - Source: Hacker News / about 2 years ago
But to your original question, it depends on your intended audience. Quality blogs are a great way to gain traction. Andrej Karpathy gained significant attention with his early blog posts, and today folks like Jay Alammar have great blogs as well. It's a shame distill.pub has slowed down; their format and content were great. If you are going to do a blog post, stay away from the 50 million identical blog posts that are like... Source: over 2 years ago
I’m looking for collaborators for re-implementing “modern” machine learning and deep learning models/papers. Modern is in quotes as I’d actually like to focus less on the super recent, and more on those around ~5 years old, as the compute required is usually more feasible. As well as the implementation (which will be open sourced, well written and documented), I’d also like https://distill.pub/ style articles to... - Source: Hacker News / over 2 years ago
Excellent post! Thank you for sharing. It resonated closely with me. I too am all too familiar with intuitively good ideas failing catastrophically. I think Christopher Olah's blog and Distill were designed around the same points that you emphasize. Such visual understanding of the models helps us empathize with the model and build meaningful metaphors. It addresses another critical element in empathizing with... Source: over 2 years ago
Have you looked at the Distill journal's circuit thread and their other papers before? They do some biological comparison stuff with machine learning neurons, and it might be somewhat related? Source: over 2 years ago
Colah's more recent stuff is on https://distill.pub/ but might not include much on manifolds as it's a niche field that most readers won't engage with. Source: over 2 years ago
There is of course https://distill.pub/. Source: over 2 years ago
Do you know an article comparing Distill to other products?
Suggest a link to a post with product alternatives.
This is an informative page about Distill. You can review and discuss the product here. The primary details have not been verified within the last quarter, so they might be outdated. If you think we are missing something, please use the means on this page to comment or suggest changes. All reviews and comments are highly encouraged and appreciated, as they help everyone in the community make an informed choice. Please always be kind and objective when evaluating a product and sharing your opinion.