Originally published at: https://boingboing.net/2024/07/29/inku-a-minimalist-color-desktop-calendar.html
…
Inku connects seamlessly with your online calendars, to-do apps and streaming services,
That would be something to test. Those seamless connections are hard work!
Some of the features look to me like they are not processing the data locally.
Connect your Accounts - Calendars, Todo lists, etc. in the app.
If that’s not being processed locally and securely on the device, but at their end (with the keys to your accounts), oh hell no!
There’s a small whiff of Rabbit R1 and NFTs to this. How do they fund this, selling boxes, or service fees?
That could simply be phone forwarding, but they also have an AI-style summary that can't be run locally on a phone, so they must be sending the content somewhere to do that.
I have to say that I personally find the product appealing, but I'm so burned by companies ingesting my data that even a whiff or a hint of this instantly sets off alarms everywhere in my brain.
Even if we ignore feeding personal data into an AI, you've also gotta figure there's a high chance of a short lifecycle before this thing is just e-waste, since it's no doubt reliant on remote servers to do anything. Server-based devices from crowdfunded startups don't have a great track record for longevity of support.
It looks like a standard color e-ink panel, like the ones you can buy at Pimoroni
Which, coincidentally, is also available in 7.3 and 4 inches!
Going Pi, I wonder if the NPU hat could handle the “AI” summary part locally?
Nah. The Pi can run simple models like edge and shape detection, but nothing that compute-intensive, even with the NPU HAT. Things may change with the newer, slimmer models meant to run locally on cellphones. The NPU HAT was a bit of a disappointment to me, TBH, as it is woefully underpowered for the kind of tasks we want to do now (it was clearly designed to accelerate the kind of tasks we were doing with the Pi during the pandemic, like face detection, shapes, gestures, and small-model learning, i.e. recognizing three or four different words).
Specifically for the Inku: if it ran something like a Pi it would not need pairing, and it would not have a 30-day battery life. My guess is that it probably runs a low-power Bluetooth-enabled microcontroller (something like a Nordic chip, probably) to save costs, leaving the heavy processing to the cellphone.
If you wanted to run a model locally, what you could do is set up a lab server (something with a graphics card to accelerate the computing) and run the model from there. I am looking at things like this to sort my own data junk.
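The nice part of the lab-server route is that local runners like Ollama or llama.cpp expose a plain HTTP API, so the Pi (or the e-ink frame, or whatever) only has to ship text over the LAN and read back the result. A minimal sketch in Python, assuming an Ollama-style server on its default port; the hostname, model tag, and prompt are placeholders for a DIY setup, not anything the Inku actually does:

```python
import requests

# Hypothetical lab-server address; Ollama listens on port 11434 by default.
SERVER = "http://lab-server.local:11434"

def summarize(text: str, model: str = "phi3") -> str:
    """Ask the locally hosted model for a short summary of `text`."""
    resp = requests.post(
        f"{SERVER}/api/generate",
        json={
            "model": model,
            "prompt": f"Summarize this in two sentences:\n\n{text}",
            "stream": False,  # one JSON blob back instead of a token stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

print(summarize("Dentist at 10am, groceries after work, call the plumber about the leak."))
```

That keeps the account data on your own hardware, which is the whole point.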
Good to know. I'll be upgrading my Pis soon, and I was wondering if I should be looking at other SBC makers that frequently have an NPU on board. No specific use case yet, just something to play with and evaluate.
This probably deserves a separate thread, but basically the main limitation of the Pis for running LLMs is the RAM (a weak processor means you either lower the resolution or run it slower, but if you cannot fit the model in memory, there's nothing you can really do).
Microsoft unveiled a model that was decently capable yet able to run in about 4GB of RAM, which means a Pi could reasonably run Phi-3 and the only constraint would be speed.
Still, I don't know much about this model's capabilities for ingesting new data, and that usually takes time and power.
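For scale: Phi-3-mini is about 3.8B parameters, so a 4-bit quantized copy is a bit over 2GB of weights, which is how it squeezes into roughly 4GB of RAM. A rough sketch of what running it on a Pi with llama-cpp-python might look like; the model path and settings here are assumptions about a DIY setup, not anything Inku ships:

```python
from llama_cpp import Llama

# Assumes you've already downloaded a 4-bit quantized Phi-3-mini GGUF;
# the path below is just a placeholder.
llm = Llama(
    model_path="models/phi-3-mini-4k-instruct-q4.gguf",
    n_ctx=2048,    # small context window to keep RAM usage down on a Pi
    n_threads=4,   # one thread per core on a Pi 4/5
)

out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "Summarize the day's calendar in one sentence."},
        {"role": "user", "content": "Dentist 10am, groceries after work, plumber about the leak."},
    ],
    max_tokens=64,
)
print(out["choices"][0]["message"]["content"])
```

It will be slow (a handful of tokens per second at best), but as said above, speed is the constraint, not whether it fits.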