Show HN: a Rust lib to trigger actions based on your screen activity (with LLMs)


Turn your screen into actions (using LLMs). Inspired by adept.ai, rewind.ai, and Apple Shortcuts. Rust + WASM. - louis030195/screen-pipe

Here's an example of server-side code, written in TypeScript, that takes the streamed data from ScreenPipe and uses a large language model such as OpenAI's to process text and images for analyzing sales conversations.

AI will soon be able to incorporate the context of an entire human life into its 'prompt', and the technologies that enable this kind of personalisation should be available to all developers to accelerate access to the next stage of our evolution.

We discuss how to bring this lib to production, help each other with contributions and personal projects, or just hang out ☕.
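A minimal sketch of what such server-side code could look like. The `ScreenFrame` shape, the prompt wording, and the model name are assumptions for illustration, not ScreenPipe's actual streaming API; only the OpenAI chat-completions endpoint itself is real.

```typescript
// Hypothetical frame shape; ScreenPipe's real streamed payload may differ.
interface ScreenFrame {
  timestamp: string; // ISO time the frame was captured
  text: string;      // OCR'd text extracted from the screen
}

// Collapse a batch of streamed frames into one prompt asking the model
// to analyze the on-screen sales conversation.
function buildSalesPrompt(frames: ScreenFrame[]): string {
  const transcript = frames
    .map((f) => `[${f.timestamp}] ${f.text}`)
    .join("\n");
  return (
    "You are a sales coach. Analyze the following on-screen conversation " +
    "and list the objections raised and suggested responses:\n\n" +
    transcript
  );
}

// Send the prompt to OpenAI's chat completions endpoint.
// Model choice and response parsing are a minimal sketch.
async function analyzeSalesConversation(
  frames: ScreenFrame[],
  apiKey: string
): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: buildSalesPrompt(frames) }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

In practice you would batch frames over a short window (say, 30 seconds) before each call, since sending every frame individually would be slow and expensive.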



Related news:

EasyTranslate thinks augmenting LLMs with humans will give it an edge over pure AI translation services

Researchers Upend AI Status Quo By Eliminating Matrix Multiplication In LLMs