Davis Treybig

Thinking in public about topics I find interesting in computing infrastructure.

The hottest Substack posts of Davis Treybig, and their main takeaways
19 implied HN points · 24 Jul 23
  1. The main factor limiting context window size is the quadratic scaling of self-attention in transformers.
  2. New research explores alternative mechanisms like Hyena Operators, State Space Models, and hierarchical attention to improve context window efficiency.
  3. Context curation and retrieval systems matter more for effective LLM performance than simply increasing the context window size.
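The quadratic scaling in point 1 can be seen in a minimal sketch of single-head self-attention (a toy illustration, not the post's code): every one of the n tokens is compared against every other token, so the intermediate score matrix has n × n entries.

```python
import numpy as np

def self_attention(x: np.ndarray) -> np.ndarray:
    """Naive single-head self-attention over a sequence x of shape (n, d)."""
    # Score matrix is (n, n): this is the quadratic term in sequence length.
    scores = (x @ x.T) / np.sqrt(x.shape[1])
    # Row-wise softmax (shifted by the row max for numerical stability).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x  # output is back to shape (n, d)

n, d = 512, 64
x = np.random.randn(n, d)
out = self_attention(x)
print(out.shape)  # the (512, 512) score matrix existed only as an intermediate
```

Doubling the sequence length quadruples the score matrix, which is why the alternatives in point 2 (Hyena Operators, State Space Models, hierarchical attention) aim to replace this all-pairs comparison.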
19 implied HN points · 15 Apr 23
  1. Large language models (LLMs) are being used in security for tasks like log analysis and incident response.
  2. LLMs are changing the landscape of traditional static analysis tools in cloud and application security.
  3. LLMs have the potential to automate processes like vendor security questionnaires and enhance engineer-oriented security workflows.
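The log-analysis workflow in point 1 can be sketched as building a triage prompt over raw log lines. This is a hypothetical illustration: the prompt wording and the idea of handing it to a chat-completion client are assumptions, not details from the post.

```python
def build_triage_prompt(log_lines: list[str]) -> str:
    """Assemble a security-triage prompt from raw log lines.

    The resulting string would be sent to any chat-completion API;
    the call itself is omitted here, since no specific client is
    named in the source.
    """
    logs = "\n".join(log_lines)
    return (
        "You are a security analyst. Review the logs below and list "
        "any suspicious events, with a one-line rationale for each.\n\n"
        f"Logs:\n{logs}"
    )

prompt = build_triage_prompt([
    "2023-04-15T02:11:08Z sshd: Failed password for root from 203.0.113.7",
    "2023-04-15T02:11:09Z sshd: Failed password for root from 203.0.113.7",
])
print(prompt)
```

The same pattern extends to the other takeaways: swap the instruction text for a static-analysis question or a vendor-questionnaire answer draft.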