OK, now I remember why I like Ruby: reading through the code for the Reality Wikipedia/DBPedia interface
I have been diving deep into Haskell this year, largely while working on examples for the Haskell tutorial and cookbook-style book I am writing. While revisiting some of my own (old) code for working with Wikipedia/DBPedia data, I ran across the very nice Reality library, which is written in Ruby. Reality is so much better than my old code, and I enjoyed reading through its implementation.
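To give a flavor of what makes Reality pleasant, here is roughly what using the gem looks like. I am writing this sketch from memory of the project's README, so treat the constructor and the property-method names as assumptions that may differ between gem versions:

```ruby
require 'reality'
include Reality

# Look up a Wikipedia/Wikidata entity by name; Reality merges infobox and
# Wikidata properties into ordinary-looking Ruby accessors. (This usage is
# quoted from memory of the gem's README; names may differ across versions.)
argentina = Entity('Argentina')

puts argentina.capital      # => a linked entity, e.g. Buenos Aires
puts argentina.population   # => a measure with units attached
```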
Ruby and Haskell complement each other in the sense that they sit at opposite ends of the programming language spectrum. If you were forced to use only two programming languages, Ruby and Haskell would be good choices. Ruby, like Clojure, has ready access to the vast Java ecosystem via JRuby, so the combination of Haskell and Ruby really does cover the bases.
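To make the JRuby point concrete, here is a tiny sketch of what calling into the Java ecosystem looks like from Ruby code running under JRuby (java.util.ArrayList is used here simply as a familiar example class):

```ruby
# Run this under JRuby: Java classes become directly usable from Ruby code.
require 'java'

list = java.util.ArrayList.new
list.add('DBPedia')
list.add('Wikidata')
puts list.size   # => 2
```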
The ability to integrate real-world data, as found in Wikipedia/DBPedia, into systems is a powerful idea. In building AI systems, large companies like Google, Facebook, and Microsoft preprocess and use available world knowledge (I worked for a while with the Knowledge Graph at Google, so I know their process, and I assume that Microsoft and Facebook are similar). For small organizations and hobbyists/enthusiasts, caching and indexing the world's knowledge just isn't possible, but much of the same effect can be had by making live API calls to DBPedia, Wikidata, and similar services.
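As a minimal illustration of the "live API calls" approach, here is a small Ruby sketch that queries the public DBPedia SPARQL endpoint using only the standard library. The helper name dbpedia_query and the particular query shown are illustrative choices of mine, not part of any library:

```ruby
require 'net/http'
require 'json'
require 'uri'

# Send a SPARQL query to DBPedia's public endpoint and return parsed JSON.
def dbpedia_query(sparql)
  uri = URI('https://dbpedia.org/sparql')
  uri.query = URI.encode_www_form(query: sparql,
                                  format: 'application/sparql-results+json')
  JSON.parse(Net::HTTP.get(uri))
end

# Fetch the English abstract for the Ruby programming language entry.
sparql = <<~SPARQL
  SELECT ?abstract WHERE {
    <http://dbpedia.org/resource/Ruby_(programming_language)> dbo:abstract ?abstract .
    FILTER (lang(?abstract) = "en")
  } LIMIT 1
SPARQL

results = dbpedia_query(sparql)
puts results.dig('results', 'bindings', 0, 'abstract', 'value')
```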
While I appreciate the work the 800 pound gorillas (Google/Microsoft/Facebook) are doing, I also hope that a rich, cooperating ecosystem of small organizations stays relevant by building systems that help everyone integrate their own data, knowledge, and experience with the deep knowledge that we all (hopefully) contribute to on the web.
I find myself pushing back against the "gorillas" by preferring, when feasible, to participate in community efforts. A good example is using the federated social network quitter.no (my account is quitter.no/markwatson). In a similar way, I hope that developers contribute to and use good open source projects that support deep knowledge management, deep learning (yes, "deep" is probably overused), and AI in general.
In a world where global corporations centralize power and control, I believe it becomes more important for people to make personal decisions to support local businesses, care about the financial and environmental health of their local communities, and continue to use the Internet and the WWW to promote individualism and community, not globalism.