Posts

Showing posts from 2021

My productivity hacks

Like most people, I have many more things that I would like to do than I have time to do. I believe that learning to identify action items that simply should not be done is valuable, but not easy to do. I am mildly attention deficit in the sense that I can only think about or attend to one thing at a time; for a computer scientist this has been a super power, but it is personally inconvenient. I keep 3 TODO lists:

- TODO high priority - I tend to have 1 to 3 things on this list. I time box my activities, so this list holds the actions that I rotate through. I usually work in 60 to 90 minute sprints, but for deep coding this may be 3 or 4 hours.
- TODO open - everything that I would like to do. A lot of stuff on this list gets deleted with no further effort (the all important decisions on what not to do).
- TODO done - instead of deleting completed actions on "TODO high priority", I cut the text and paste the action text to the top of this list.

I really like the F…
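The three-list rotation above is simple enough to sketch in code. This is a toy model only; the class and method names are mine, not from the post:

```python
# Toy model of the three-TODO-list workflow described above.
# The list names mirror the post; the methods are illustrative only.

class TodoLists:
    def __init__(self):
        self.high_priority = []   # 1 to 3 actions, rotated through in sprints
        self.open = []            # everything else; freely pruned
        self.done = []            # completed actions, newest first

    def add(self, action: str) -> None:
        """New ideas land on the open list."""
        self.open.append(action)

    def promote(self, action: str) -> None:
        """Move an action from 'open' to 'high priority'."""
        self.open.remove(action)
        self.high_priority.append(action)

    def prune(self, action: str) -> None:
        """The all-important decision on what NOT to do."""
        self.open.remove(action)

    def complete(self, action: str) -> None:
        """Cut from 'high priority', paste at the top of 'done'."""
        self.high_priority.remove(action)
        self.done.insert(0, action)
```

The key design point is that `prune` is as cheap as `complete`: deleting an action takes one call and leaves no guilt trail.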

DBPedia Natural Language Interface Using Huggingface Transformer

I prototyped a simple natural language question answering demo in about 90 minutes. I accept a query like “where does Bill Gates work?”, find the likely DBPedia URI for Bill Gates, collect some comment text for this DBPedia entity via a SPARQL query, and then pass the original query to the transformer model with the collected comment text as the “context”. I run this on Google Colab. Note that I saved my Jupyter notebook as a Python file, shown in the listing below. Note the use of ! to run shell commands (e.g., !pip install transformers).

```python
# -*- coding: utf-8 -*-
"""DbPedia QA system.ipynb

Automatically generated by Colaboratory.

Original file is located at
    https://colab.research.google.com/drive/1FX-0eizj2vayXsqfSB2ONuJYG8BaYpGO

**DBPedia Question Answering System**

Copyright 2021 Mark Watson. All rights reserved. License: Apache 2
"""

!pip install transformers
!pip install SPARQLWrapper

from transformers import pipeline

qa = pipeline("question-answering")  # excerpt truncated here; the default extractive QA pipeline is the likely call
```
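The flow described above can be sketched as two pieces: a pure function that builds the SPARQL query for an entity's rdfs:comment, and a wrapper that runs the lookup and the QA pipeline. The function names and the use of the default `question-answering` pipeline model are my assumptions, not taken from the original notebook:

```python
# Sketch of the DBPedia QA flow: SPARQL lookup for comment text,
# then extractive question answering over that text as the context.

RDFS_COMMENT = "http://www.w3.org/2000/01/rdf-schema#comment"

def comment_query(entity_uri: str) -> str:
    """Build a SPARQL query fetching the English rdfs:comment for a
    DBPedia entity; the comment serves as the QA 'context'."""
    return (
        "SELECT ?comment WHERE { "
        f"<{entity_uri}> <{RDFS_COMMENT}> ?comment . "
        'FILTER (lang(?comment) = "en") }'
    )

def answer(question: str, entity_uri: str) -> str:
    """Run the full flow. Needs network access plus
    `pip install SPARQLWrapper transformers`."""
    from SPARQLWrapper import SPARQLWrapper, JSON
    from transformers import pipeline

    sparql = SPARQLWrapper("https://dbpedia.org/sparql")
    sparql.setQuery(comment_query(entity_uri))
    sparql.setReturnFormat(JSON)
    bindings = sparql.query().convert()["results"]["bindings"]
    context = " ".join(b["comment"]["value"] for b in bindings)

    qa = pipeline("question-answering")  # default extractive QA model
    return qa(question=question, context=context)["answer"]
```

A call like `answer("where does Bill Gates work?", "http://dbpedia.org/resource/Bill_Gates")` mirrors the demo: the entity URI would come from whatever entity-lookup step maps "Bill Gates" to a DBPedia resource.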