Also from their site:
TensorFlow isn't a rigid neural networks library. If you can express your computation as a data flow graph, you can use TensorFlow. You construct the graph, and you write the inner loop that drives computation. We provide helpful tools to assemble subgraphs common in neural networks, but users can write their own higher-level libraries on top of TensorFlow. Defining handy new compositions of operators is as easy as writing a Python function and costs you nothing in performance. And if you don't see the low-level data operator you need, write a bit of C++ to add a new one.
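The "construct the graph, then write the inner loop that drives computation" idea can be sketched in a few lines of plain Python. This is a hypothetical illustration of the data-flow-graph concept, not TensorFlow's actual API; the names (`Node`, `const`, `run`, etc.) are made up for the example.

```python
# Minimal data-flow-graph sketch: operations are nodes, values flow
# along edges, and a driver loop evaluates the graph. Illustrative
# only -- these names are not TensorFlow's API.

class Node:
    def __init__(self, op, *inputs):
        self.op = op          # callable that computes this node's value
        self.inputs = inputs  # upstream nodes whose outputs feed in

def const(value):
    return Node(lambda: value)

def add(a, b):
    return Node(lambda x, y: x + y, a, b)

def mul(a, b):
    return Node(lambda x, y: x * y, a, b)

def run(node, cache=None):
    """Drive the computation: recursively evaluate inputs, memoizing results."""
    if cache is None:
        cache = {}
    if node not in cache:
        args = [run(n, cache) for n in node.inputs]
        cache[node] = node.op(*args)
    return cache[node]

# Build the graph first, then drive it: (2 + 3) * 4
graph = mul(add(const(2), const(3)), const(4))
print(run(graph))  # 20
```

Note that `add`, `mul`, and any composition of them are just Python functions returning graph nodes, which is the sense in which "defining handy new compositions of operators is as easy as writing a Python function."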
That almost gets me to download it, but given my programming skillz, I’d most likely succeed in getting the internet-enabled toaster to team up with the Xbox to have me killed.
Great! Thanks to the IoT, everything will be telling me that I suck.
It’s neat, but it’s not 5 years ahead of everything else. I’ve played around with it; Theano + Keras provide most of the same functionality, as do Torch and Neon with the standard add-ons. I really do like that it’s in Python rather than Lua, and that’s probably why I’ll switch. I don’t particularly like the build system. The documentation is much more glossy than the open-source alternatives’, but Google has a history of documentation bitrot. The stuff they haven’t yet released (distributed training, datasets) is more significant, but may be matched by everyone else soon enough. Not sure why they’re releasing and PRing this right now, but I’d be skeptical that it’s purely altruistic. Perhaps someone else can think of reasons why?
I would assume there’s news (intentional or leaked) about some (potentially scary) advance they’ve made with the Boston Dynamics IP.