Posts

I retired yesterday: my list of things to do in retirement

What does an intelligent person do in retirement? That is a question of individual taste, but I will share my list of 20 things. Yesterday was my last day working on a recommendation model at Babylist. Babylist is a great company to work for, but I decided to retire in order to spend more time helping my wife, who now has chronic health problems. When I write books, my wife enjoys editing my work, so we will keep doing that. I also plan on being a gentleman computer scientist, working on open source deep learning applications and semantic web/linked data tools and applications. I may end up not doing all of these things, but generally I plan on spending more time on current interests and starting some new hobbies:

Retirement Activities
- Join an Internet Chess club **DONE**
- Get a fishing license
- Video Games
- Improve my cooking/recipe web site
- Reading
- Release new editions for my 3 most popular eBooks
- Practice guitar, Native American Flute, and didgeridoo
- Eco-b...

My productivity hacks

Like most people, I have many more things I would like to do than time to do them. I believe that learning to identify action items that simply should not be done is valuable, but not easy. I am mildly attention deficit in the sense that I can only think about or attend to one thing at a time. For a computer scientist this has been a super power, but it is personally inconvenient. I keep 3 TODO lists:

- TODO high priority - I tend to have 1 to 3 things on this list. I time box my activities, so this list holds the actions I rotate through. I usually work in 60 to 90 minute sprints, but for deep coding this may be 3 or 4 hours.
- TODO open - everything that I would like to do; a lot of stuff on this list gets deleted with no further effort (the all-important decisions on what not to do).
- TODO done - instead of deleting completed actions on "TODO high priority", I cut the action text and paste it at the top of this list. I really like the F...

DBPedia Natural Language Interface Using Huggingface Transformer

I prototyped a simple natural language question answering demo in about 90 minutes. I accept a query like “where does Bill Gates work?”, find the likely URI for Bill Gates, collect some comment text for this DBPedia entity, and then pass the original query to the transformer model with the “context” being the comment text collected via a SPARQL query. I run this on Google Colab. Note that I saved my Jupyter Notebook as a Python file that is in the listing below. Note the use of ! to run shell commands (e.g., !pip install transformers).

    # -*- coding: utf-8 -*-
    """DbPedia QA system.ipynb

    Automatically generated by Colaboratory.

    Original file is located at
        https://colab.research.google.com/drive/1FX-0eizj2vayXsqfSB2ONuJYG8BaYpGO

    **DBPedia Question Answering System**

    Copyright 2021 Mark Watson. All rights reserved. License: Apache 2
    """

    !pip install transformers
    !pip install SPARQLWrapper

    from transformers import pipeline

    qa = pipeli...
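The SPARQL lookup step described above (fetch an entity's rdfs:comment text, then hand it to a question-answering pipeline as the context) can be sketched as follows. This is a minimal sketch, not the notebook's actual code: the helper function names are mine, and it assumes DBpedia's public endpoint and the dbpedia.org resource naming convention.

```python
# Sketch of the DBpedia lookup step: build a SPARQL query that fetches
# English rdfs:comment text for an entity, to use as the "context" for a
# Hugging Face question-answering pipeline. Helper names are illustrative.

def comment_query(entity_uri: str) -> str:
    """Build a SPARQL query for the English rdfs:comment of an entity."""
    return f"""
SELECT ?comment WHERE {{
  <{entity_uri}> rdfs:comment ?comment .
  FILTER (lang(?comment) = 'en')
}} LIMIT 1
"""

def fetch_context(entity_uri: str) -> str:
    """Run the query against the public DBpedia endpoint (network call)."""
    from SPARQLWrapper import SPARQLWrapper, JSON
    sparql = SPARQLWrapper("https://dbpedia.org/sparql")
    sparql.setQuery(comment_query(entity_uri))
    sparql.setReturnFormat(JSON)
    results = sparql.query().convert()
    bindings = results["results"]["bindings"]
    return bindings[0]["comment"]["value"] if bindings else ""

if __name__ == "__main__":
    # Network and model download required for this part:
    context = fetch_context("http://dbpedia.org/resource/Bill_Gates")
    from transformers import pipeline
    qa = pipeline("question-answering")
    print(qa(question="where does Bill Gates work?", context=context))
```

The comment text is short, which suits extractive QA models well; for harder questions you would want to gather more abstract text per entity.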

I have a new job helping to build a Knowledge Graph at Olive AI

I retired a year ago April (my last job was Master Software Engineer and manager of a deep learning team at Capital One) and was enjoying time with friends and family, doing personal research in hybrid AI, lots of writing, and volunteering at our local food bank. I stopped my volunteer work because of COVID-19 and welcomed the opportunity last month to start work at Olive AI on a very strong Knowledge Graph team. I believe in their mission, and the work and the people are great! It is refreshing to leave the deep learning field, at least for a while. My heart is in developing stronger AI that can explain its actions and adapt flexibly to help people in their lives. I always take a humans-first stand on technology. AI systems should help us get our work done efficiently, remove tedium, give us more time for creative activities, and let us more fully enjoy our own humanity.

I have tried to take advantage of extra time during the COVID-19 pandemic

My wife Carol and I have been practicing social distancing and wearing masks for shopping for over 5 months now. Welcome to the new normal and a crazy world in which entertaining and seeing friends is done by meeting in people's yards, everyone bringing their own "meal in a bag." I enjoy writing, so I have been updating my recent books, starting with Loving Common Lisp, or the Savvy Programmer's Secret Weapon and A Lisp Programmer Living in Python-Land: The Hy Programming Language. These are free to read online and licensed with Creative Commons Share and Share Alike, No Commercial Reuse, so you can also find copies on the web (hopefully up to date copies!). Last month I started a much larger project: I have not updated my book Practical Artificial Intelligence Programming With Java since the fourth edition was published in 2013. I have discarded a lot of older material like expert systems, and have three new chapters on the semantic web and also a new chapter on...

Custom built SBCL and using spaCy and TensorFlow in Common Lisp

Here are some of my recent notes that might save you some time, or teach you a new trick. I have had good results using the py4cl library when I wrap API calls to TensorFlow or spaCy in a short Python library that calls those Python libraries and returns results in simple types like strings and dictionaries. I just committed a complete example (Python library and Common Lisp client code) to the public repo for my book Loving Common Lisp, or the Savvy Programmer's Secret Weapon; it will be added to the next edition of the book. Here is a link to the subdirectory with this new example in my repo: https://github.com/mark-watson/loving-common-lisp/tree/master/src/spacy

I frequently make standalone executable programs using SBCL, and I just noticed a great tip from Zach Beane for compressing the size of standalone executables. Start with rebuilding SBCL from source to add the compression option; get the source code and:

    ./make.sh --with-sb-thread --with-sb-core-compression ...
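The Python-side wrapper pattern mentioned above is roughly this: expose the heavy library behind small functions that return only plain strings, lists, and dicts, which py4cl can marshal into Common Lisp values. The sketch below uses hypothetical function names, not the repo's actual API, and keeps the spaCy import inside the function so the module loads even without spaCy installed.

```python
# Sketch of the py4cl wrapper pattern: expose spaCy behind functions
# that return only plain Python types (strings, lists, dicts), which
# py4cl can marshal into Lisp values. Function names are illustrative.

def entities_to_dicts(doc) -> list:
    """Convert spaCy entity spans to a list of plain dictionaries."""
    return [{"text": ent.text, "label": ent.label_} for ent in doc.ents]

def find_entities(text: str) -> list:
    """Run spaCy NER and return plain data, safe to pass back via py4cl."""
    import spacy  # lazy import so this module loads without spaCy present
    nlp = spacy.load("en_core_web_sm")
    return entities_to_dicts(nlp(text))
```

On the Lisp side, py4cl can then import this module and call find_entities, receiving plain data back rather than opaque Python object handles.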

Protecting oneself from surveillance capitalism

As an author I find occasional use of Facebook and Twitter useful for “broadcasting” notifications of my new books, open source projects, etc. I also find gmail useful for some types of email. Still, I do like to take a few easy steps to push back a little against the free use of my web behavioral data to profit corporations I don't do business with (and those I do):

- Use ProtonMail as my primary email
- Use Firefox on my Linux and macOS laptops with individual containers for Google, Facebook, etc.
- On iOS devices, favor browsing with private tabs.
- Use a VPN when I am traveling and when I need to use public WiFi
- Limit use of my gmail address to a backup email and as a junk email address. For online purchases from Amazon, etc., use a secure email service that does not use the contents of your email to market to you and as data to sell to 3rd parties.
- Frequently delete all cookies from web browsers that you use.
- Use private browsing windows for routine us...

My hopes and predictions for the next 10 years

Hello everyone, I wish you all a happy and healthy new year! Here are my predictions for the next ten years:

- Wearable devices like the Apple Watch will become widely used, and because of user pushback we will see companies like Apple, Google, Toshiba, Huawei, Samsung, etc. start to support standards that allow, for example, an Apple Watch to interact with a Samsung TV. Further, I expect a single personal device (watch or phone) to become, for most users, a connection hub that interacts with public kiosks, displays, input devices, etc.
- Deep learning architectures and techniques will rapidly improve and will continue to rule the world, at least for a while. I expect at least one new dramatic paradigm shift for AI beyond current deep learning, reinforcement learning, etc. models.
- The world economies will get hit hard, and wealth will in larger part be measured in terms of ownership of water and food production, manufacturing, technology IP, and hopefully hard assets like gold, silv...

GANs and other deep learning models for cooking recipes

I retired this spring after working on artificial intelligence projects since the 1980s. Freedom from having to work on large projects for other people and companies is liberating and frees up time for thinking about new ideas. Currently I am most interested in deep learning models for generating and evaluating recipes - for now I am using a GAN model (which I am calling RecipeGAN). When I managed a deep learning team at Capital One, I used GANs to synthesize data. During a Saturday morning quiet-time hacking sprint in my first month at that job, I had the idea to take an example program, SimpleGAN, that generated MNIST digits and instead generate numeric spreadsheet data (using the Wisconsin Cancer Data Set that I had previously used in my books as example machine learning data). I was really surprised how well this worked: I could generate fake Wisconsin cancer data, train a classification model on the fake data, and get classification prediction accuracy on real data samples that wa...
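The evaluation idea above, train a classifier on synthetic data and measure its accuracy on real samples, is sometimes called "train on synthetic, test on real." Here is a minimal sketch of that protocol; seeded Gaussian blobs stand in for both the GAN output and the real tabular data, and a tiny nearest-centroid classifier stands in for a real model. None of this is the actual RecipeGAN or Capital One code.

```python
# Sketch of "train on synthetic, test on real" evaluation. Gaussian blobs
# stand in for GAN-generated and real tabular data; a nearest-centroid
# classifier stands in for a real model.
import numpy as np

def nearest_centroid_fit(X, y):
    """Return one centroid per class label."""
    return {label: X[y == label].mean(axis=0) for label in np.unique(y)}

def nearest_centroid_predict(centroids, X):
    """Assign each row to the class with the closest centroid."""
    labels = list(centroids)
    dists = np.stack([np.linalg.norm(X - centroids[l], axis=1) for l in labels])
    return np.array([labels[i] for i in dists.argmin(axis=0)])

def tstr_accuracy(synthetic_X, synthetic_y, real_X, real_y):
    """Train on synthetic data, report accuracy on real data."""
    centroids = nearest_centroid_fit(synthetic_X, synthetic_y)
    preds = nearest_centroid_predict(centroids, real_X)
    return float((preds == real_y).mean())

rng = np.random.default_rng(0)

def make_data(n):
    """Two well-separated classes of 5-dimensional tabular rows."""
    X0 = rng.normal(0.0, 1.0, size=(n, 5))
    X1 = rng.normal(3.0, 1.0, size=(n, 5))
    return np.vstack([X0, X1]), np.array([0] * n + [1] * n)

fake_X, fake_y = make_data(200)   # stand-in for GAN output
real_X, real_y = make_data(100)   # stand-in for held-out real data
print(tstr_accuracy(fake_X, fake_y, real_X, real_y))  # high if fakes are faithful
```

If the synthetic data captures the real distribution, this accuracy approaches what you would get training on real data; a big gap is a sign the generator is missing structure.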

Back living in Sedona Arizona and enjoying my retirement

My wife and I returned to our home in Sedona Arizona in June. I had been managing a deep learning team for Capital One in Champaign Illinois (in the research park at UIUC). I am now retired, so we moved back into our house in the mountains in Central Arizona. Re: retirement: while I might still do small, interesting consulting jobs, I am retired. I am spending my time volunteering at a local food bank, hiking and kayaking with my friends, and I joined a local writers group to give myself a shove to finish a sci-fi book I have been working on for a long time. I released a second edition of my Haskell book this week, and I have edits for a new edition of my Common Lisp book that I will push to current readers soon, but I plan on no longer writing new technical books. I have written 22 technical books - probably sufficient :-) Personally my passion is still studying artificial intelligence and deep learning, but this is now research for my personal pleasure.

My large Haskell + Python project KGcreator (tool for automating the generation of Knowledge Graphs) and auto code formatting

You might wonder what the topics of my large Haskell + Python project KGcreator and auto code formatting have to do with each other. In addition to working on two Python books (Python Intelligent Systems and Deep Learning and Graph Databases), my main 'retirement' activity has been writing a lot of Haskell code and a smaller amount of Python code for my KGcreator project. After reading a discussion on Hacker News yesterday about Python code tidy/auto-format tools, I decided to add Makefile targets. After a 'stack install stylish-haskell hindent' and a 'pip install yapf', I added something like this to my Haskell top level Makefile:

    tidy:
    	cd src/fileutils; stylish-haskell -i *.hs; hindent *.hs
    	cd src/nlp; stylish-haskell -i *.hs; hindent *.hs
    	cd src/sw; stylish-haskell -i *.hs; hindent *.hs
    	cd src/webclients; stylish-haskell -i *.hs; hindent *.hs
    	cd test; stylish-haskell -i *.hs; hindent *.hs

And something like this to my Python top level Mak...

I retired from Capital One yesterday

With deep gratitude for a great company and a great job, I retired from my role as manager of the UIUC machine learning team and Master Software Engineer. Capital One has deep machine learning talent, so check them out if you are looking for ML work. Thanks especially to my team for being interesting to work with and for the kind going-away gift of a locally made goban (Go board), bowls, and Go stones. A wonderful gift. I will miss you all! When my family and friends hear me talk about retirement they do so with great skepticism, since I have retired several times already! That said, I feel like kicking back, finishing my current book project, and perhaps doing limited consulting work after I travel a bit to see family and friends.