Distill Reviews and details

Screenshots and images

  • Distill Landing page (2021-09-26)

Videos

How To Monitor Website Changes? - Distill.io Chrome Extension

Social recommendations and mentions

We have tracked the following product recommendations or mentions on various public social media platforms and blogs. They can help you see what people think about Distill and what they use it for.
  • How Transformers Work
    Distill was a new take on publishing research/ideas in deep learning in a visual way: https://distill.pub/ I love their articles, and while it was hard to sustain, the quality of the ones in there is pretty good. They provide some tips and templates on how to develop such visual storytelling articles. - Source: Hacker News / 7 months ago
  • Reverse Engineer Hidden Algorithms
    Explainable AI is far from its early stages. Read into Anthropic AI's work in mechanistic interpretability, like toy models of superposition along with the rest of the transformer-circuits papers. Read Chris Olah's Distill papers. Read Neel Nanda's recent work on reverse engineering how language models grok modular addition. Read Kevin Meng's work on locating and editing facts inside of GPT. Read OpenAI's paper on... Source: 11 months ago
  • Sharing a side project: Linear Algebra for Programmers
    I also wasn't aware of either The Pudding or distill.pub. So thanks for just mentioning those. Source: about 1 year ago
  • Ask HN: What's your favorite illustration in Computer Science?
    Anything from Setosa [0] is really good. It contains interactive, animated illustrations of several Machine Learning ideas. I _loved_ reading papers from Distill Pub [1] as they contained interactive diagrams. My favorite one so far is the thread on Differentiable Self-organizing Systems [2]. I liked the lizard example very much, as it is interactive, and lizards grow lost organs back. I think this is funny.... - Source: Hacker News / over 1 year ago
  • Ask HN: What's your favorite illustration in Computer Science?
    If you include deep learning in CS then https://distill.pub/ has a lot to offer in this category. - Source: Hacker News / over 1 year ago
  • Ask HN: What are some blog posts that you have enjoyed going through?
    Not sure if this counts as a blog, but I really liked Distill: https://distill.pub/. - Source: Hacker News / over 1 year ago
  • [N][R] Hugging Face Machine Learning Demos now accessible through arXiv
    Love how things like this, Papers With Code, and distill.pub are letting academic papers finally evolve past paper :). Source: over 1 year ago
  • I'm great at math, but terrible at machine learning
    Your talents may be better employed at tasks that require deeper thought and rigor, like interpretability: https://distill.pub/. Source: over 1 year ago
  • [D] Accurate blogs on machine learning?
    Distill, although they quit last year - https://distill.pub/. Source: over 1 year ago
  • [D] Citing blog posts
    However, there are plenty of blog posts that are already cited in the ML community and that address these points: - Independent blog posts. Some of them have been cited numerous times according to [google scholar](https://scholar.google.com/scholar?cluster=2043775592782102444) and are very transparent about their updates. (Though the GitHub repo of the blog post is not public, which would address the consistency... Source: over 1 year ago
  • Is there literature about how different neural network layers recognise certain features?
    Articles in https://distill.pub/ explore relevant topics. Some of the topics might be useful to you. Source: almost 2 years ago
  • [D] ICLR 2022 blog post track
    I would argue that math, proofs and other technical details should never disappear. In that sense, papers should never disappear. But it might be interesting to see what website proceedings could look like (probably something like Distill). Source: almost 2 years ago
  • Articles from arXiv.org as responsive HTML5 web pages
    This is cool. I’m wondering what the responsive part is — the example seems like it’s just latex2html output basically? I think it would be super cool to have more https://distill.pub style papers, with like interactive plots instead of static image files and stuff. - Source: Hacker News / about 2 years ago
  • [Discussion] What are some papers you read which helped you build an intuition of how neural networks function?
    Interpretability research on distill.pub and some of Anthropic's recent papers. Source: about 2 years ago
  • Works in Progress has joined Stripe
    I hadn't heard about Works in Progress before, but it looks like incredibly well researched content. Pretty stark difference between the content here and what you see on something like sfgate.com or cnn.com. Reminds me of another incredible resource for deep learning articles: https://distill.pub/, which unfortunately looks like it is going silent. (Also I find it pretty awesome that Stripe is successful enough to... - Source: Hacker News / about 2 years ago
  • [D] How to do self-research?
    But to your original question, it depends on your intended audience. Quality blogs are a great way to gain traction. Andrej Karpathy gained significant attention with his early blog posts, and today folks like Jay Alammar have great blogs as well. It's a shame distill.pub has slowed down; their format and content were great. If you are going to do a blog post, stay away from the 50 million identical blog posts that are like... Source: over 2 years ago
  • Ask HN: Who Wants to Collaborate?
    I’m looking for collaborators for re-implementing “modern” machine learning and deep learning models/papers. Modern is in quotes as I’d actually like to focus less on the super recent, and more on those around ~5 years old, as the compute required is usually more feasible. As well as the implementation (which will be open sourced, well written and documented), I’d also like https://distill.pub/ style articles to... - Source: Hacker News / over 2 years ago
  • [D] Why Do "Good" Research Ideas Fail?
    Excellent post! Thank you for sharing. It resonated closely with me. I too am all too familiar with intuitively good ideas failing catastrophically. I think Christopher Olah's blog and Distill were designed around the same points that you emphasize. Such visual understanding of the models helps you empathize with the model and build meaningful metaphors. It addresses another critical element in empathizing with... Source: over 2 years ago
  • [R] I have been working on a learning/organizing rule of biological neurons for the past 2 years, and I am wondering whether something similar was already discovered and/or whether it is worth trying to get published
    Have you looked at the Distill journal's circuit thread and their other papers before? They do some biological comparison stuff with machine learning neurons, and it might be somewhat related? Source: over 2 years ago
  • Neural Networks, Manifolds and Topology
    Colah's more recent stuff is on https://distill.pub/ but might not include much on manifolds as it's a niche field that most readers won't engage with. Source: over 2 years ago
  • [D] A list of Visual Intros to DL Topics 🌌
    There is of course https://distill.pub/. Source: over 2 years ago

Generic Distill discussion

This is an informative page about Distill. You can review and discuss the product here. The primary details have not been verified within the last quarter, and they might be outdated. If you think we are missing something, please use the means on this page to comment or suggest changes. All reviews and comments are highly encouraged and appreciated, as they help everyone in the community make an informed choice. Please always be kind and objective when evaluating a product and sharing your opinion.