I speak only for myself, not my employers
Ultimately, a hardware/software construct which works in concert with the brain to extend its capabilities.
Whatever you design it to do.
Input
Output
RSSAgent:

{
  "expected_update_period_in_days": "5",
  "clean": "false",
  "url": "https://www.2600.com/rss.xml"
}

E-mailAgent:

{
  "subject": "{{ title }}",
  "headline": "Published on {{ date_published }}",
  "body": "{{ description }}{% line_break %}{% line_break %}{{ content }}",
  "expected_receive_period_in_days": "365"
}
...and the article is e-mailed automatically.
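As shown, the pipeline forwards every article; in practice a filtering agent often sits between the two. As a rough sketch (the agent type and option names below are Huginn's stock Trigger Agent, not something from these slides), a Trigger Agent that only passes along articles whose titles mention security topics might look like:

TriggerAgent:

{
  "expected_receive_period_in_days": "2",
  "keep_event": "true",
  "rules": [
    {
      "type": "regex",
      "value": "security|vulnerability|breach",
      "path": "title"
    }
  ]
}

Wired between the RSS Agent and the E-mail Agent, it drops everything that doesn't match the regex.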
(Edited for size)
{
  "date_published": "2016-04-10 20:50:00 +0000",
  "last_updated": "Sun, 10 Apr 2016 20:50:00 +0000",
  "url": "http://www.voanews.com/content/fierce-campaigning-ahead-...",
  "description": "The next U.S. presidential primary contest is...",
  "content": "The next U.S. presidential primary contest is more...",
  "title": "Fierce Campaigning Ahead of New York Presidential Primary",
  "authors": [
    "webdesk@voanews.com (Michael Bowman)"
  ],
  "categories": [
    "USA",
    "2016 USA Votes"
  ]
}
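The {{ ... }} placeholders in an E-mail Agent's options are Liquid tags resolved against event payloads like this one. Assuming Huginn's stock Liquid interpolation, the template shown earlier would render against this event roughly as:

Subject: Fierce Campaigning Ahead of New York Presidential Primary

Published on 2016-04-10 20:50:00 +0000

The next U.S. presidential primary contest is...

The next U.S. presidential primary contest is more...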
It didn't start out as a unified project or even a personal tool.
It was my "I want to learn about $foo" project.
It was also a manifestation of "Hey, I'm not on dialup anymore!" syndrome.
This meant that I had constant access to all of the information sources available at the time... and never enough time to keep up with all of it, let alone figure out what was useful and what wasn't.
So I started writing bots that would poll those information sources for updates, figure out what was in them, sift out the useful stuff and send me digests of what they found.
(Remember: RSS wasn't invented until 1999!)
Years went by, and maintaining my code eventually became impractical.
I had started writing a framework to re-implement my bots when one of my co-workers sent me a link to Huginn, which already did far more than my existing software.
I ported my existing bots to Huginn in an afternoon.
Later I started writing my own bots because I started running into Huginn's functional limits.
(Please, bandwidth gods, be with me!)
I spend less time at work reading security briefs, websites, and Twitter feeds to keep on top of new vulnerabilities, attacks, and data breaches. I get more work done by applying that information.
When doing research, I farm out searches across multiple search engines to bots, so I can spend more time digesting the information and writing. My personal search engines let me search my archives and notes more efficiently.
I find out about new and interesting stuff without having to spend hours every day browsing dozens of sites. I get e-mail digests with links that I scan whenever I have a free moment. I can also filter out stuff that's not interesting.
I have a framework for experimenting with new stuff in a practical way, like machine learning and neural networks.
I get to spend more time
The Doctor [412/724/301/703/415]
E-mail: drwho at virtadpt dot net
PGP: 0x807B17C1 / 7960 1CDC 85C9 0B63 8D9F DD89 3BD8 FF2B 807B 17C1
Web: https://drwho.virtadpt.net
Keybase: https://keybase.io/drwho
Twitter: @virtadpt
Github: virtadpt
Public profile: about.me