Open Source Contributions in 2020

When I wrote my 2019 open source contributions annual review I had high hopes for my open source contributions in 2020. As I wrote in my 2020 health annual review, I allowed the political upheaval in my home country, the US, to distract me way too much. Sure, there was some COVID distraction in March/April, but if anything I was actually hoping the lack of travel would give me time to focus more on code generation. It was not to be. That excuse aside, I still managed to put 698 hours into open source projects. That’s a slight uptick from 2019’s 653 hours but short of the 1000 hours I was hoping to contribute. The distribution looks very different as well, with most of it concentrated around my work with the B612 Foundation. The five projects I contributed to the most fall into a relatively broad range of software (from highest to lowest number of hours contributed):

My Renewed Resolve to #DeleteFacebook Thanks to Steven Levy

I was having a moment of weakness in my quest to permanently get off Facebook. Over the last two weeks I’d missed some major things that were happening to some friends and family. Yes, I ultimately learned about them, but only because my spouse is still on Facebook and he made a point of letting me know. At the same time I was still feeling constricted in my ability to discuss things with friends. I had been texting friends through various systems: LinkedIn, SMS, Matrix, iMessage, Twitter DMs, etc., and it occurred to me that all of this would have been in one singular place (Facebook Messenger) before now. So what was the point? Why not, I rationalized, rejoin and figure out how to leverage manual cross posting or some other mechanism to help extract friends from Facebook? Then it struck me: the next book up on my reading list was Steven Levy’s Facebook: The Inside Story. As I wrote on social media, perhaps that book would push me one way or the other. Boy did it ever! The book is a great exploration of the entire history of Facebook right up to the present day. While I had mixed emotions earlier in the book, as the story of Facebook went on so too did my revulsion at the idea of ever going back.

First Post Diaspora API Against Real Server

I’ve spent today dusting off my old Diaspora API-driven blog comment system code. The details of that implementation can be found in this blog post from late-2018. Now that the API is running on a production server thanks to Diaspora-Fr, I have revived the code running on my server and pointed it at their Diaspora server. I never coded up a full handshake for the initial authentication steps, so that is all manual unfortunately, but I believe it is now up and running. The way I coded the server it can only point to one host at a time, but since this is a proof of concept right now that’s fine. For the time being I’ll be linking against that Diaspora server for comments on threads. You can comment from any Diaspora server though, not just that one. If you don’t have a Diaspora login then you can simply read the comments.

Open Source Contributions in 2019

I’m pretty stoked about what I was able to do in 2019 towards open source software. I’ve always contributed here and there, but I took the momentum of the contributions I made in the second half of 2018, in that case to the Diaspora project, and just kept on trucking. I spent a total of 653 hours on open source projects in 2019. A lot of that was new code generation, but there is of course more to development than just writing code. There was also plenty of meeting time, some hackathons, documentation generation, tech support, etc. Some of these were projects I started; others were contributions to established projects. The five projects I contributed to the most fall into a relatively broad range of software (from highest to lowest number of hours contributed):

Funding Open Source Automatically

For a very long time I’ve been considering contributing to open source financially not just with code contributions. The model I was originally working on was one where I’d “buy” the free-as-in-beer open source software that I was using regularly. Looking at my software stack it’s almost all projects which don’t necessarily have huge corporate backing. Yes I use GitHub but that’s essentially Microsoft at this point. Yes I use Java but that foundation has huge corporate sponsors. Yes I use Linux which has lots of sponsors for some pieces but the projects I use are the smaller off to the side ones. So how did my “buy the software” model work? It turns out that plan sucks.

Open Source Contributions for July/August 2019

Between the months of July and August I had a month of travel. Part of that was a three-week trip to southern Europe and another week at my first astrodynamics conference in a very long time (the AIAA/AAS Astrodynamics Specialists Conference). Because of that I actually forgot to post some of the things that I had been up to for the first half of July, before my trip. I’m therefore combining my open source contributions update to cover both months in one post. One of the biggest shifts you’ll see is my focus for the time being. While I spent much of May and June focusing on Avalonia or development using it, my focus in the past couple of months, and for the time being, is shifting to the B612 Foundation, a non-profit organization looking to make strides in improving our ability to track near-Earth asteroids and give us enough warning time to mitigate a potential impact if one is predicted to occur. The work I’m doing on their open source astrodynamics engine and related tools is the perfect merger of my interests and technical capabilities: software engineering and aerospace engineering.

Getting Ready for Diaspora API Going Live

It’s been a while since I’ve done development around Diaspora regularly, or anything associated with the API. When I saw the announcement of the work being done with the API at their Hackathon back in April I was pretty stoked. It looks like it will be on track for the release, which hopefully will be in the near future. I was especially excited to see that there is a possibility of someone putting it up on a live test server to work with. To get ready for that I wanted to make sure that my two code bases, the “test harness” and the “Comment Reflector” that uses it to create comments for a blog as described in this blog post, would work as soon as a server went live with the code.

Open Source Contributions for June 2019

I follow a lot of open source developer blogs, including some from project-based blog aggregators like Debian, Ubuntu, and some .NET developers. One of the things they do that I like is provide a monthly summary of their open source contributions. In some cases I’m pretty sure it’s part of accountability for getting some funding to work on the project. In other cases I think it’s just a little historical tracking on their part. Some people make lots of changes and others just have one small package they incremented slightly. As I (hopefully) continue to ramp up my open source software contributions I want to begin that process as well. Some of these are relatively inconsequential things. Many of them are documentation-driven. All of them, however, I hope will help open source projects or users of open source projects in at least some infinitesimal way. Worst case, I’m just publicly documenting my own experiences to make for easy reference for myself as well as potentially others. This is the first such summary article.

My Contribution Conundrum

I took the deep dive into the Fediverse last year when I decided to bite off the Diaspora API development task with Frank Rousseau. It was a great experience and I had hoped to do a lot more Diaspora work. With a lot of the ActivityPub discussions, and there being some really good questions about how that should work, I had embarked on an experiment to see what a merged Fediverse social media experience would feel like. Friendica has tie-ins to Diaspora, ActivityPub, and many others, so it was a great candidate for it. I am way behind on doing my write up but I have my notes. That’s for another post. This post is about a conundrum I’m facing with respect to my open source/Fediverse contributions: I don’t know which project(s) I want to focus on any longer.

Integrating With the Greater Fediverse

I remember the first time I had to integrate myself into a new community. It was right after college. I had started my first job which was in a new specialization of my industry. I had to come to grips with a life transition, learning how to work with a new team and new software, learning about the ins and outs of the industry around me and those interactions, et cetera. It is a very unsettling position to have orders of magnitude more things to learn than time to do it. No one expects someone to pick it all up instantly but in me there is a drive to “come up to speed” as fast as possible. When it comes to contributing to the Fediverse I am feeling the exact same thing right now.

Diaspora API Real World Usage: A Blog Discussion Timeline

“Dogfooding” software is one of the best ways to wring out any problems with a design or implementation. The Diaspora API was designed with a wide variety of uses in mind including something potentially as grand as being the replacement backend for a revamped website. With the actual API now “in the can” and waiting for the real PR review I decided to try to use the API for an actual purpose and start dogfooding it. I had several ideas but the first one I decided to latch on to was a blog discussion timeline feature.

Diaspora API Dev Progress Report 30

We’ve finally done it! Frank and I were able to get the last of our internal reviews done and the API code is now in the “real” code review for integration into the main Diaspora development branch. That alone is an amazing thing but I have a second piece of big news related to the API as well. Today I was able to stand up a first version of a blog “Discussion Browser” that uses the API to pull all comments and other interactions for a blog post that is associated with a specific Diaspora post. I’m going to be doing a write up of that in more detail later but as a first cut it worked pretty well and showed that the API design and the code itself is functioning pretty well.

Diaspora API Interactions Part 2: The First "Real" Interaction

I was so excited when I finally got a real pod interacting with the API that I knew I’d have to get it written down before I could get to sleep. However, before dropping right into the interactions themselves, I decided to take some time describing how a piece of software would be allowed to do anything with a server. In Part 1 I laid all of those details out to get across some very important points:

  • We are using a standard (OpenID/OAuth2) protocol for doing this
  • Users have to give explicit permissions to an application, including being told what the application is and is not asking to be able to do
  • There are security measures once an application is granted permissions as well

This article essentially details the very first communications and gives people a feel for what the Diaspora API specification looks like in practice, not just in theory.
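
Those security measures follow the standard OAuth2 authorization-code flow. As a rough Ruby sketch of the token-exchange step (the endpoint path and all credential values below are invented placeholders for illustration, not necessarily Diaspora’s actual route or values):

```ruby
require "net/http"
require "uri"
require "json"

# Build the form parameters for a standard OAuth2 authorization-code
# token exchange. All values are illustrative; a real pod hands the app
# its client credentials when the application registers.
def token_request_params(auth_code, redirect_uri, client_id, client_secret)
  {
    "grant_type"    => "authorization_code",
    "code"          => auth_code,
    "redirect_uri"  => redirect_uri,
    "client_id"     => client_id,
    "client_secret" => client_secret
  }
end

# Perform the exchange against a pod. The token endpoint path here is an
# assumption for illustration only.
def exchange_code_for_token(pod_host, auth_code, redirect_uri, client_id, client_secret)
  uri = URI("https://#{pod_host}/api/openid_connect/access_tokens")
  params = token_request_params(auth_code, redirect_uri, client_id, client_secret)
  response = Net::HTTP.post_form(uri, params)
  JSON.parse(response.body) # contains the access token on success
end
```

In a real flow the authorization code only arrives at the redirect URI after the user has explicitly approved the requested permissions.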

Diaspora API Interactions Part 1: Authentication

Okay, I’m obviously overexcited about the fact that something which I knew should work actually did work. However, all the previous API usages were on servers on the local machine, not behind an HTTPS link, and not being shared with the rest of the fediverse. This one breaks through that barrier. I have therefore decided to document it in excruciating detail. For the first pass all of these interactions were manual, using cURL and the Firefox RESTClient plugin. The next step, which will be coming up very shortly, will be creating the very first server to use this for a real purpose (I’ll document that as it happens). This document goes over the nitty gritty details of the whole authentication piece. The next article will go into the calls themselves. If you don’t care about the nuances of the authentication steps then just skim or skip this and go to Part 2. So without further ado, here we go…

Diaspora API Dev Progress Report 29

As we begin to wrap up the year, we are also beginning to wrap up the API, getting ready for the “real” pull request for the API code. We are down to one last code review of the final clean up pass before we have it looked at by the core team. I think the code is pretty solid, but it will of course have problems that are discovered during the review and the testing. Ah, the testing, real world testing that we really need to do. To get there we need to have a test server. Thankfully that’s all taken care of now and we’ve had the first data interactions with a pod.

Diaspora API Dev Progress Report 28

Today is a momentous day in the Diaspora API development saga.  Today we have completed primary development of the API, the unit tests, and the external test harness.  There are still two code reviews between that and the real code review for integration into the main development branch, but all of the major work is complete.  What does that mean exactly?

Diaspora API Dev Progress Report 27

Boy are we really coming down the home stretch now!  All of the scopes are implemented in every API endpoint now with their corresponding tests to confirm that the permissions are working correctly.  The most difficult of those, I thought, was the Streams, again.  After beating my head against a rock a lot yesterday I put the whole project down for the day and then picked it up today.  After warming up on the other endpoints I started working my way through getting Streams working such that it could filter private data.  After a bit of fumbling I finally got a relatively simple solution to the problem and got all the tests passing correctly.

Diaspora API Dev Progress Report 26

It’s been almost a week since there’s been an update on the API.  I’ve been busy with other things and travel so it didn’t get as much focus as I would have liked.  However, there has been some progress.  Thanks to Frank’s help we’ve been able to get all of the side branches merged into the core API branch, so we are now coming down the home stretch on getting it ready for integration.  The first order of business for that is getting the OpenID security stuff squared away.  I’m still working on understanding that better, and the more I go back to it and work with it the better it looks.  There is still the question of the "refresh token" workflow, but work has been done on it, so if anything it’s a small tweak or a documentation thing versus a from-scratch development effort.  Even in the event that it were a from-scratch thing, with the code base I have and the examples I mentioned before it shouldn’t be a huge effort to get working.  Most of the security work is therefore integrating in the much more fine grained security scopes which Senya has been working to hone.

Diaspora API Dev Progress Report 25

With the documentation changes wrapped up, but holding off on PRs until things solidify a bit more from the code scrub process, it was time to move on to the OpenID deep dive and review.  Up until now I’ve been working with an authorization workflow that required me to request a new token every 24 hours and for the user to authenticate it.  I wasn’t sure how much of that was because of the flow I chose or intrinsic to how it was coded up.  As I continued to go over the OpenID documentation and other articles on the process I just couldn’t get it working.  It was then clear to me that what I needed was an example to help me.

Luckily Nov Matake created some example projects to go along with his OpenID gems, one for the OpenID Connect Provider (the server side) and one for the OpenID Relying Party (the app side).  I figured with that everything would be good to go.  After all, this was the same code he had running up on Heroku, but I wanted to see the nitty gritty details and set it up on both sides since I was going to need to do that with Diaspora and the test harness, or any other API use case I may be interested in.  As I quickly came to find out, these projects had never been updated.  They still relied on old versions of Ruby and Rails.  Instead of trying to downshift everything to these versions I decided to fork the projects and get them running under Ruby 2.4+ and Rails 5.  Unfortunately that derailed my entire Diaspora development effort for the day. The upside is that the community will have modern versions of these projects to use.  I intend to polish them up a little more and then issue a PR back to the original projects.  My versions however can be found on my GitHub profile with the Connect Provider here and the Relying Party here.

In the process of doing these upgrades I was able to learn a lot more about porting Ruby code up from older versions.  I also got a much better understanding of some OpenID flows.  I’m going to use that to continue to move forward on the review of the implementation in the API and looking at client side implementation details.  Because of the complexity of that whole process I think that’s probably something developers can use a good amount of help for via blog posts and examples.

In Summary:

  • Documentation updates are complete but waiting for PRs for after the code scrub
  • Updated Ruby on Rails OpenID examples from Nov Matake to work under Rails 5

You can follow the status dashboard at this Google Sheet as well.

Diaspora API Dev Progress Report 24

Yesterday I said the paging API was complete but needed to be reviewed.  The more I talked over some elements with people and in exchanges on Diaspora, the more I realized there were a couple of tweaks I needed to make.  The first suggestion I implemented was to have paging on any endpoint that returns multiple elements.  The second was to have a parameter for specifying the number of elements requested.  I was pleased that supporting that feature was really just two lines of code to change.  However, while in there I decided to beef up the defensive programming techniques in some other places.

After that was done I moved on to implementing the ability to vote on polls.  There was no obvious home for it, but since it interacts with a post I put it on the Post Interactions endpoint rather than create a dedicated endpoint with just one method.  It aliases to a path in the same way as the rest of the interactions as well, so I think it’s consistent.  That also required moving some things from the existing endpoint into a service and then having both call that.  Since there were no tests around that capability I ended up writing those as well.  With that done it’s time to move on to the documentation and then start hitting up the OpenID review.
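
That service extraction can be sketched roughly as follows; every name here is invented for illustration and does not mirror Diaspora’s actual classes:

```ruby
# Rough sketch of pulling shared poll-voting logic into a service object
# so that both the existing controller and the new API endpoint can call
# it instead of duplicating the validation. Names are hypothetical.
class PollVoteService
  # Validates the answer against the poll and returns a vote record.
  def self.vote(user:, poll:, answer:)
    unless poll[:answers].include?(answer)
      raise ArgumentError, "#{answer.inspect} is not an answer on this poll"
    end
    { voter: user, poll_id: poll[:id], answer: answer }
  end
end
```

Both callers then delegate to `PollVoteService.vote`, which is also where the new unit tests can focus.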

In Summary:

  • Incorporated suggestions in the paging in the API
  • Completed the Poll Voting method
  • Moving on to documentation updates

You can follow the status dashboard at this Google Sheet as well.

Diaspora API Dev Progress Report 23

After a day of coding the paging is now in every endpoint that should have it.  That means that we have paging right now for:

  • Contacts
  • Photos
  • Posts
  • Comments
  • Notifications
  • Conversations (but not messages in conversations)
  • Search
  • Streams

Because of the size of the code changes I would imagine there will at least be some tweaking, and I could imagine some larger refactoring afterward too, but it’s in as solid, working, and performant a state as the existing standard endpoints, so I’m happy with it.

Now it’s on to the rest of the checklist.  With the scopes being rounded out I’m going to hold off on the security review for a little while longer.  The first low hanging fruit I’m working on is adding to the API spec the ability to vote on polls.  It was an oversight in the original design but it should be easy to do.  I just need to decide which endpoint to add it to.  After that I’m going to double back to the mundane documentation update task.  At that point I think it’ll be time to get up to my elbows in the OpenID code and get ready to make changes for the new scopes.

In Summary:

  • Paging is now complete and ready for review
  • Starting work on voting on polls through the API

You can follow the status dashboard at this Google Sheet as well.

Diaspora API Dev Progress Report 22

Paging, paging, and more paging.  I haven’t been committing as much time to development the last few days as I’d like.  Some of that is frustration with the development process on the paging, which has been a lot of trial and error.  Some of it is just how my schedule is working out too.  There is progress there though.  I have what I’d consider to be the rounded out API paging infrastructure in place.  It has migrated a bit since the last update, since as I tried to use it I wasn’t happy with it.  I’m still not happy with it, but it is suitable.  There will probably be some additional tweaking before final integration, but what it allows is for us to have paging.  I ended up wringing out design problems by wiring it into the Aspects Contacts endpoint method (to test index-based paging) and the User’s Posts endpoint (to test time-based paging).  With all of that working and unit tested I’m now moving on to adding it to the rest of the endpoints. There have also been some additional discussions on the permissions scopes for the endpoints, and I think we’ve converged on a good final set.

In Summary:

  • Paging API infrastructure modified to current MVP (I think) status
  • Paging API now used in the Aspects Contacts and the Users Posts method
  • Rounding out the remaining endpoints and updating the test harness

You can follow the status dashboard at this Google Sheet as well.

Diaspora API Dev Progress Report 21

Coming up with a paging infrastructure for the API while looking at all of the ways it could be used and abused hasn’t been fun.  Not that it hasn’t been totally worthwhile.  I’ve actually learned a lot more about some of the nuances of how ActiveRecord and related libraries build up their queries. I’ve thought a lot more about the nature of the queries within Diaspora too.  At the same time my head is numb, and for all of the effort I only got a half-completed design and less than 100 lines of code across two classes, not that more lines is necessarily better.

So what we will have are two paginator types: index-based and time-based. The standard methods across the two are:

  • page_data: returns the current page of data for the passed-in query
  • next_page: returns information to go to the next page of data
  • previous_page: returns information to go to the previous page of data

The previous/next page functions will either return a new paginator object that corresponds to that page or a string of query parameters that can be passed back out from a REST endpoint.

Both paginator types take a query object that will then have additional paging logic wrapped around it.  If one is doing an index-based query this is just wrapping the WillPaginate library.  However, if one is doing a time-based query then it’s a little more complicated than that.  We aren’t simply moving around indexes; we’re actually doing some time math.  That is all coded directly in the class.  The big difference between the two comes in how the ordering happens on the SQL query.  In both cases you can pass in an ordered query without throwing an error.  However, in the case of the IndexPaginator one probably wants to pass in their preferred order, otherwise they’ll get whatever the natural order from the database is.  In the case of the TimePaginator, it wants to keep control over sorting by whichever time field the calling code is using, so adding an additional sort could create confusing results.
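
The time-based idea can be illustrated with an in-memory sketch: plain arrays stand in for ActiveRecord queries and all the names are invented, so this is only a cartoon of the approach, not the actual classes:

```ruby
# In-memory cartoon of time-based paging: instead of advancing an index,
# each page is defined by a timestamp cutoff, and "next page" just means
# "items older than the last one shown". Names are hypothetical.
class TimePaginator
  def initialize(items, per_page:, before: Float::INFINITY)
    @items    = items     # array of hashes with a :created_at timestamp
    @per_page = per_page
    @before   = before    # only items strictly older than this are shown
  end

  # The current page: the newest items older than the cutoff.
  def page_data
    @items.select  { |i| i[:created_at] < @before }
          .sort_by { |i| -i[:created_at] }
          .first(@per_page)
  end

  # A paginator for the next (older) page, or nil when data runs out.
  def next_page
    page = page_data
    return nil if page.size < @per_page
    TimePaginator.new(@items, per_page: @per_page,
                      before: page.last[:created_at])
  end
end
```

`previous_page` would work the same way with the comparison inverted, and the real classes wrap SQL queries rather than arrays.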

Now that the paginators are done I need to add a presenter class that knows how to turn the query parameters into a “link” field with full URLs, per the API specification, and to update the services to call into and return the paginated data instead of their current form.  I think I’ll do one that uses indexes, like contacts, followed by one that uses time, like user posts, and then start filling it out the rest of the way from there.

In summary:

  • Finished playing around with the base pagination classes and completed them
  • Starting to wire pagination into the first endpoints

Diaspora API Dev Progress Report 20

Now that we’ve hit feature complete status it’s about getting more of the legwork done to get us really ready for integration.  The first necessary feature we need before that is paging.  As I wrote earlier, some endpoints don’t need paging and all of them technically have it as an optional thing.  However, to be really useful we need to have paging for several endpoints like posts, photos, conversations, et cetera.  It looks like we can leverage a lot of the way we do paging in the lower levels for streams and just create a standard pager class that the API endpoints that need it can use.  I’ve laid out how I want to approach that so now it’s on to implementation.

Along with the progress on the paging there has been progress in other mundane areas.  All of these features were developed in side branches which needed to be reviewed and integrated into the main API branch.  We are down to one endpoint left before the API branch itself is feature complete, not just having the code.  All of the branches are orthogonal except for the routes.rb file and the en.yml messages file, so it’s a pretty easy integration but it needs to be done properly.  In the meantime we are also having discussions about the finer grained permission sets that apps will request and users will be notified about.  So for example, an app could be given permission to only read posts but read/write comments on posts, and so on.  The endpoints already check for read/write tokens but they are broad tokens.  Part of the next steps will be putting in the proper requests and making sure that the information presented to users is clear.
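
A toy sketch of what such finer grained checks might look like; the scope names, route table, and check are all invented for illustration and are not Diaspora’s actual implementation:

```ruby
# Hypothetical per-area read/write scopes: each endpoint declares the
# scope it needs, and a token is authorized only if that scope was
# granted by the user when the app registered.
REQUIRED_SCOPES = {
  "GET /posts"     => "posts:read",
  "POST /posts"    => "posts:write",
  "GET /comments"  => "comments:read",
  "POST /comments" => "comments:write"
}.freeze

# True only if the granted scopes cover what the endpoint requires.
def authorized?(granted_scopes, endpoint)
  required = REQUIRED_SCOPES.fetch(endpoint) { return false }
  granted_scopes.include?(required)
end
```

So an app granted `["posts:read", "comments:read", "comments:write"]` could read posts and read/write comments, but any attempt to create a post would be rejected.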

In summary:

  • All but one endpoint is integrated back into the API main branch
  • Started work on the API Paging infrastructure
  • Looking at the finer grained permissions for each endpoint

Diaspora API Dev Progress Report 19

We’ve finally reached the milestone we’ve all been waiting for.  With the completion of the Search API Endpoint the Diaspora API is now feature complete.  That doesn’t mean that it’s ready for integration into the mainline branch.  It also doesn’t mean that there isn’t more fundamental work that has to be done before it can be used on a production system.  It does however mean that we can start working on rounding out some of the other fundamentals and make our way in that direction.

The first thing that I am going to work on is the paging aspect of the API.  The API spec discusses paging as a thing that endpoints may or may not do.  Right now there is no paging.  That’s fine for some things, like getting a list of Aspects for a user.  It is a requirement for something like getting a list of a user’s posts or for getting your stream.  For non-developers who are reading this, think of this as the piece that makes your “infinite scroll” work.  Diaspora has implemented this in other areas but it will have to work a bit differently for the API.  We’ve already had discussions about how we want it to work and there is a format specification for reporting it back.  It therefore should be relatively straightforward to get it implemented.  That is what I’m working on right now.  After that we’ll want to go over all of the new code with a fine tooth comb for style and idiom consistencies (beyond the automatic style checker), security reviews, etc.  Lastly we’ll want to get the OpenID authentication/authorization/etc. stuff polished up a bit.  Currently the app has to be re-registered every day.  That’s not going to be viable for a real user even if it is for testing.

Still, the fact we’ve reached a feature complete milestone is great news and I’m excited to be ending the weekend on that high note.

In summary:

  • Diaspora API is now feature complete
  • Search API endpoint, unit tests, and test harness are complete
  • User contacts endpoint implemented completing that endpoint
  • Beginning work on paging infrastructure for API endpoints that need it

To follow along with status please see the Google Sheet Dashboard.

Diaspora API Dev Progress Report 18

After the long-winded post a few days ago on the API Status the latest update is pretty brief but important:

  • Notifications API endpoint, unit tests, and test harness are complete
  • Work on the last endpoint (search) has begun.

Diaspora API Dev Progress Report 17

The last couple of days have been a lot of heavy effort slogging through some increasingly complex changes to get the API going.  I started with what I thought was going to be a relatively easy time with the notifications; however, the deeper I went into it the more I realized that I either had to come up with some relatively (for me anyway) complex queries to populate some of the return types or I had to settle for some N+1 type query behaviors.  “N+1 queries” are ones where you pull the results one piece at a time.  That’s fine for smaller data sets, like five or ten or something, but if you are dealing with hundreds of entries you are really thrashing your system.  So I got about half way through the notifications API and then put it on the shelf and moved on to the API I was dreading the most: Photos.
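
To make the N+1 pattern concrete, here is a toy simulation with a fake query counter; in Rails the batched form would typically be something like `includes` or a single `WHERE id IN (...)` query, and everything below is an invented stand-in:

```ruby
# Toy simulation of N+1 queries against an in-memory "database".
# A global counter shows why per-item lookups thrash the database
# while one batched lookup does not.
$query_count = 0

AUTHORS = { 1 => "alice", 2 => "bob" }

def fetch_author(id)     # simulates SELECT ... WHERE id = ?
  $query_count += 1
  AUTHORS[id]
end

def fetch_authors(ids)   # simulates SELECT ... WHERE id IN (...)
  $query_count += 1
  AUTHORS.slice(*ids)
end

posts = [{ author_id: 1 }, { author_id: 2 }, { author_id: 1 }]

# N+1 style: one lookup per post.
$query_count = 0
posts.each { |p| fetch_author(p[:author_id]) }
n_plus_one_queries = $query_count  # grows with the number of posts

# Batched style: a single IN query covers every post.
$query_count = 0
authors = fetch_authors(posts.map { |p| p[:author_id] }.uniq)
batched_queries = $query_count     # one query regardless of post count
```

With three posts the first form issues three queries and the second just one; at hundreds of notifications that difference is exactly the thrashing described above.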

I was really psyching myself out about having to deal with the whole image file upload part of the Photos API and then the subsequent tie-in with the Posts API.  It shouldn’t be that complicated, but these are things I had never done in Rails or with the Kotlin Fuel framework.  How would they interact?  How difficult would the security checks be?  You get the idea.  It did take several hours of figuring out what the current controller is doing and then how I wanted to refactor the more complicated operations into a service, but I got there.  Once I had that I had to test the whole aspect of limited posts, et cetera, which I hadn’t done as well as I had thought previously.  Thankfully my Ruby unit tests were solid; I just had some hiccups in my test harness.

At the end of the day we have the Photos API and the Posts API working with photos perfectly, to the point where I was able to make a fully populated post, including with an image that was uploaded externally.  That means I’m going to jump back on the Notifications API to wrap that up, and then all that’s left is the Search API.

In summary:

  • Partial Progress on the Notifications API but shelved to figure out queries later
  • Posts API is feature complete with full tests
  • Was able to create an entirely populated post with the respective images from scratch using an external application for the first time ever in Diaspora (see this post)
  • 1.5 Endpoints left to go to be feature complete

Diaspora API First: A Full Externally Created Post

After slogging away for most of today on the Photos API, with lots of needing to understand how things work and a couple more tweaks before it was ready, I decided to celebrate by showing the ultimate progress report: a screenshot.  What is so special about this screenshot?  It is the first post in Diaspora that has been fully made by an external application.  The “external application” in this case is a test harness written in Kotlin which is designed around the API spec.  This test harness first uploaded the image file, then it created the post with every feature a post can have, including location, polls, and references to other users.  The post was written by a “user3” (for testing, might as well stick to simple names).  This is a screenshot from user1’s perspective.  Notice that they also got the expected notification.  Yes, it’s still a bit of a ways from done, but it’s still a great milestone, so I’d say it’s time to celebrate for a bit before getting back to it :).

Diaspora API Dev Progress Report 16

Brief update from today on the Diaspora API development progress:

  • On the Users API it turns out we probably still want the contacts endpoint, if only for the primary user, since the Contacts API works on a per-aspect level the way it is mapped.  Whether that method shows up in the Contacts API at a different mapping or on the User itself is still TBD, but it will be a change to the spec.
  • The Post Interactions API is feature complete with full tests and the completed test harness.
  • Work has begun on the Notifications API.  This is the first change I’ve done that will require a DB migration, adding a new GUID column to notifications, so this is going to take a bit longer for me to complete as I do background research on that.

At this point it’s actually easier to look at what is left to do versus what we have done (which is a huge plus):

  • The only two endpoints that haven’t been touched are Photos and Search. Once these are done (along with work on Notifications) the entire API spec will have been implemented.
  • Implement a new poll interaction method for answering a poll through the API
  • We need to implement paging on several of the endpoints.  The technique will be similar to how it’s done in the core controllers, but it has to differ in that the return type needs to carry the next/previous pages and the corresponding format needs to honor that.  The actual mechanics of the queries are pretty much the same though, so grafting them into the existing feature complete controllers should be relatively easy.
  • Right now the OpenID integration works well enough for testing but it currently requires revalidating the app every 24 hours.  This has to be tweaked to be more reasonable.  There may be some refactoring in there as well.
  • The Posts API Endpoint accepts any photos currently, including those that are already attached to another post.  This is not consistent behavior and has to be corrected to only allow a “pending” photo to be added.
  • Sweep of all of the APIs for consistency on security, service initialization (where appropriate), params parsing idioms, etc.
  • Sweep through the unit tests to make sure that edge cases are covered in the same way
  • Documentation updates to account for things discovered during the development (error codes added, format tweaks etc.)
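The paging item above is mostly about the return shape.  As a minimal plain-Ruby sketch, a paged envelope might look like the following; the keys, link format, and query parameters are my own illustration, not the actual API spec.

```ruby
# Build a paged response whose return carries next/previous page links
# alongside the data, as the paging to-do describes.  Envelope keys and
# parameter names are illustrative only.
def paged_response(items, base_url, page: 1, per_page: 15)
  pages = (items.size / per_page.to_f).ceil
  slice = items[(page - 1) * per_page, per_page] || []
  links = {}
  links[:previous] = "#{base_url}?page=#{page - 1}&per_page=#{per_page}" if page > 1
  links[:next]     = "#{base_url}?page=#{page + 1}&per_page=#{per_page}" if page < pages
  { data: slice, links: links }
end
```

The underlying query mechanics stay the same as the core controllers’; only this envelope differs, which is why grafting paging onto the already feature complete controllers should be relatively easy.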

Diaspora API Dev Progress Report 15

It’s been two weeks since my last Diaspora API Dev Progress report but that’s not because nothing has been going on.  Between the RubyConf 2018 attendance last week and this week being a holiday week there was definitely a drop off in how much development time I put into Diaspora, and therefore mostly into the API.  However over that time there has been some development progress:

  • All of the previous work has been successfully merged down into the main API branch.
  • The Contacts API is feature complete with full tests and the completed test harness
  • The Users API is feature complete with full tests and test harness, with the exception of the User Contacts API method.  That method was supposed to return another user’s contacts if that user allowed it.  However, that feature no longer exists in Diaspora, so I believe it is extraneous.  If that’s agreed upon then this is feature complete and ready to go.

This week I should be able to apply a lot more development effort than I have been able to in the past couple of weeks.  Hopefully that translates into forward progress on some more endpoints.  The trend seems to be that they are getting more difficult to knock out, so my velocity is slowing.  I guess it’s better than being stymied in the beginning.

Diaspora API Dev Progress Report 14

Yesterday was the first day in several that I could commit real time to D* again.  After getting back up to speed and making the status post I went into the API development again.  I was able to make some good progress on some brand new endpoints.  The first one I worked on, and the first that needed from-scratch coding of the main code, was the Tag Followings controller.  The day before I had struggled to get Rails to make the POST for creating tags work against the spec.  However, after talking it over and thinking about it, it was the spec that needed changing.  In another software framework I could just make it work, but relying on the auto-wiring in Rails brought the design flaw to light.  With that simple spec change, real development of the Tag Followings endpoint started yesterday.

The methodology I’m using when developing the new controllers is as follows.  First, I want to get the basic infrastructure and the tests in place.  That means the first phase is to write the skeleton of the controller code, the skeleton of the RSpec tests, and to wire the two together.  I make sure that the routes behave the way I think they should according to the API spec without worrying about returns etc.  The skeleton of the controller should implement all routes.  The skeleton of the unit tests should cover the happy path and reasonable error conditions: the user passing the wrong ID for a post they are trying to comment on, an empty new tag to follow, and so on.  I then go over to the external test application and code up the corresponding code there as well.  With everything running I make sure that the endpoint is reachable from the outside (which it should be), but don’t worry about returns, processing, etc.  If it’s possible to set up fake returns easily I do that; otherwise I just ensure the proper methods are called.

After all of that is coded and committed, it is off to filling in the controller method by method.  For each one coded up I complete the unit tests and the external test harness interactions as well.  Once that’s all done I move on to the next one.

In some cases, like Tag Followings, there needs to be refactoring elsewhere, which has implications for the above flow.  I usually do those pieces before coding the controller.  It is at design time that it becomes apparent whether I should be using code in common with another controller that may not yet exist as a Service component.  If I need to make changes in other code, I check that there are unit tests which properly cover the changes I am going to make (at least as best as I can tell), write those if needed, and then make the changes.  This should minimize the possibility of disruption.
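As a concrete illustration of that first phase, here is roughly what a skeleton controller might look like, with every route present and stubbed.  The base class is a tiny stand-in so the sketch runs outside Rails, and all names and routes are hypothetical, not the actual Diaspora code.

```ruby
# Stand-in for ApplicationController so this sketch is self-contained.
class ApplicationController
  attr_reader :response

  def render(json:, status: :ok)
    @response = { status: status, body: json }
  end
end

# Phase one: every route from the API spec exists and is reachable,
# with placeholder bodies to be filled in method by method later.
class TagFollowingsController < ApplicationController
  # GET /api/v1/tag_followings
  def index
    render json: { todo: "list the followed tags" }
  end

  # POST /api/v1/tag_followings
  # The skeleton already handles the obvious error condition (an empty
  # tag name), mirroring the skeleton unit tests described above.
  def create(name)
    if name.to_s.strip.empty?
      render json: { error: "tag name required" }, status: :unprocessable_entity
    else
      render json: { name: name }, status: :created
    end
  end

  # DELETE /api/v1/tag_followings/:name
  def destroy(_name)
    render json: {}, status: :no_content
  end
end
```

In the real flow the RSpec skeleton and the external test harness would be written against these routes at the same time, before any of the placeholder bodies are filled in.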

When interacting with Frank R. on the merge requests, one piece of feedback I got was that with everything compressed down to one commit it was hard to tell why I did certain things.  As I code all of that is there, but I’ve been rebasing everything down to one commit per endpoint so that when it comes time to merge the API branch into the main develop branch the log will look something like: Post API endpoint complete, Comments API endpoint complete, etc.  To get around this I’m trying a new flow.  When I think something is ready to be merged I’m doing a Work in Progress (WIP) Pull Request (PR).  That PR has the raw commit history and “WIP” at the start of the label.  After a review and a thumbs up I rebase it down to one commit and then submit the final one for integration.  By the time the WIP is done the code is feature complete, however, and should be ready to be merged.  I’m therefore counting WIP PRs as the threshold for saying something is feature complete.

With all that said the three new endpoints that were feature complete as of yesterday are: Tag Followings, Aspects, and Reshares.

Diaspora API Dev Progress Report 13

After a week of distractions I finally have a new update on the progress.  We’ve successfully merged all the work done to date into the one main API branch and are now working on new features moving forward.  The first feature we have completed with full tests and test harness interaction is the ability to manage and work with the user’s followed tags.  So we have the full post lifecycle from before, and now tags, done but not yet merged into the main branch.

Diaspora API Dev Progress Report 12

The merging of the various side branches into the main branch is coming along.  Because this isn’t being done as a primary job there is a bit of an expected delay between the pull request (PR) being generated and the branch being merged in.  This is giving me the opportunity to work on other features in Diaspora though.  The process is going along much faster than I expected, which is good.  At this point we have merged the Likes, Comments, and Post Endpoints together.  The PR on the Post Endpoint is now queued up; however, all of those changes exist in one branch.  That means I was able to perform a full Post life cycle test using the test harness.  In other words, we have an external application talking through the API and doing the following for a user:

  1. Creating a post
  2. Querying for the post and printing out its data
  3. Adding a comment to the post
  4. Liking the post
  5. Printing out the comments and who liked the post
  6. Deleting their comment on a post
  7. Unliking a post
  8. Deleting a post

This is a very important step. Follow additional progress on the API Progress Google Sheet.
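The eight steps above can be run end to end against a toy in-memory stand-in for the API.  This is only a sketch of the lifecycle ordering; the class and method names are illustrative, not the real HTTP endpoints.

```ruby
class ToyPostApi
  def initialize
    @posts = {}
    @next_id = 0
  end

  def create_post(author, body)                   # 1. create a post
    id = (@next_id += 1)
    @posts[id] = { author: author, body: body, comments: {}, likes: [] }
    id
  end

  def get_post(id)                                # 2. query the post
    @posts.fetch(id)
  end

  def comment(post_id, author, text)              # 3. add a comment
    cid = (@next_id += 1)
    @posts.fetch(post_id)[:comments][cid] = { author: author, text: text }
    cid
  end

  def like(post_id, author)                       # 4. like the post
    likes = @posts.fetch(post_id)[:likes]
    likes << author unless likes.include?(author)
  end

  def interactions(post_id)                       # 5. comments and likers
    post = @posts.fetch(post_id)
    { comments: post[:comments].values, liked_by: post[:likes] }
  end

  def delete_comment(post_id, cid)                # 6. delete a comment
    @posts.fetch(post_id)[:comments].delete(cid)
  end

  def unlike(post_id, author)                     # 7. unlike the post
    @posts.fetch(post_id)[:likes].delete(author)
  end

  def delete_post(post_id)                        # 8. delete the post
    @posts.delete(post_id)
  end
end
```

In the real test harness each of these calls is an authenticated HTTP request from the external Kotlin application; the sketch just captures the order of operations the lifecycle test walks through.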

Diaspora API Dev Progress Report 11

It’s been a few days since I’ve been able to put real time into Diaspora development, but I’m back today.  Being back home from travel also means I can finally get past the blockers on the other branches.  I’ve actually gotten all of the branches I had been developing on to feature complete status, with full tests, and the test harness fully coded against them.  That means that through the API one can complete the entire Post, Comment, Like, etc. lifecycle for posts with all data types (regular, photos, polls, location, etc.).  Conversations are also feature complete with full test harness as well.  Streams are also complete; however, I haven’t tested with sufficient post volumes to exercise paging behavior.  Now comes the trick of working through the tech debt of getting them merged together into the API branch.  Hopefully that’ll happen in the next day or two.  I’m going to spend some time doing other Diaspora stuff as I work through those pieces as well.  As always, follow the progress on the API Progress Google Sheet.  After the merge I’ll be moving on to the Tags Endpoint, the first endpoint that is a full from-scratch development for me.

In Summary:

  • Fully feature complete endpoints with full external test harness interaction completed are: Comments, Conversations, Likes, Posts, and Streams (except for paging behavior).
  • Ready for merging of the side branches into the main API branch

Personal Reminder: no one has a right to your time

Life is actually a very short, finite thing.  Each day there are only so many waking hours, into which one can pour only so much energy.  Do you pour it all into useful work, spending time with family, doing nothing but watching television or playing games, or whatever?  The bottom line is that we have to decide how to spend that time in a way that will make us as contented as we can be.  We will miss the mark, obviously, but that doesn’t mean one has to engage in behaviors they know are moving in the opposite direction.

Diaspora API Dev Progress Report 10

Even though it was another short day on the road it was a productive day.  The Conversations Endpoint’s Messages method got completed shortly after I typed up the previous day’s status message this morning.  I then jumped onto the Streams API.

Diaspora API Dev Progress Report 9

I’m still on the road so my contributions aren’t as great as I’d like them to be, but I did manage to make some progress on the API development.  At this point the Conversations Endpoint is done, minus the message listing of a conversation itself (next up).  The test harness is coded up against Conversations such that it can create, read, and hide/ignore them.  As I finish up the Conversations Endpoint work and wrap up the Posts Endpoint work when I get back home, I will soon be leaving the world of reviewing the existing implementation done by Frank (augmenting the tests, writing test harnesses, and making changes to get all of the tests to pass) and entering the world of from-scratch development on the rest of the API.

Diaspora API Dev Progress Report 8

While I’m on the road I’ve been hoping to get some more work in on the API.  Yesterday was a bust, and I knew it would be.  Today looked like it was going to be a bust too, but I was able to get some time in tonight thanks to some plans that were cancelled at the last minute.  As I sat down to start working I realized that I hadn’t been quite as prepared to develop on the road as I thought.  Before leaving I made sure my development laptop’s Ruby VM was fully configured and could compile the main code and the Kotlin test harness.  I was all good to go!  Except I forgot to push my work up to GitHub and GitLab.  Oops.  Well, that derailed continuing work on the Posts API Endpoint, but with plenty more endpoints to go I started on the Conversations endpoint, the next most filled-in one to start from.

I did make a good amount of progress fleshing out the unit tests and making some code changes so that the requests and returns on the Create method correspond to the specification.  It was at that point I realized I hadn’t tested my setup thoroughly enough.  I didn’t have a registered application in my OpenID setup on this dev instance, and I didn’t have the configurations I used when I set it up on my main development machine either.  After some fumbling around I managed to get it registered so I could start testing the external test harness against the endpoint.  After some final code tweaks I got that up and running and now have the test harness generating new conversations between two users!  On to the rest of the Conversations API tomorrow!

Diaspora API Dev Progress Report 7

I’m still making good, albeit slow, progress on the Posts Endpoint.  While the Posts Endpoint doesn’t have a lot of methods, the complexity of the send and return data is far greater than the other endpoints I’ve done so far.  Posts have more than just text.  They can have polls, geolocation data, mentions, aspects management, and photos.  Yet posts are the core of the whole system.  They are the digital elements we interact with the most, so progress on this endpoint is crucial.  I’m pleased to say that at this point I’ve made enough progress with the unit tests and the test harness that I have been able to have an external program do the full lifecycle of posting: create a post, read a post, comment on a post, and like a post.  I’m pretty stoked about that!  While I have the full complement of all post data available on the GET method tested, I still have to create the test harness methods around pushing posts with ancillary data (location, polls, mentions, photos), and need to write the unit tests for photos as well.  The Photos endpoint for uploading photos during a real post creation process is a whole other matter though, but we’ll get to it soon enough!

Diaspora API Dev Progress Report 6

Today I didn’t get as much progress as I had hoped on the API, but important work still got done.  Yesterday I discovered that something was probably off in the way the repository rebasing was done when I did it about a week ago.  Today I confirmed it.  Working with Benjamin Neff (SuperTux) I was able to figure out a path forward for correcting the problem.  While the git commands are pretty straightforward, being comfortable that I’ve done it correctly is another matter, so I did the process three times in a row.  Each time I looked at the corresponding git log afterward and did a three-way diff of the API branch head before the new rebase, the API branch head after the rebase, and the main Diaspora develop branch.  I may end up doing it a fourth time (or reconfirming this last run anyway) before doing a final push, after talking with Frank about it.

After getting past that I spent the other half of the time making actual progress on development.  Thanks to Dennis Schubert’s (DensChub) efforts we were able to make some progress on some API questions I had.  After that I made changes to the respective implementations to make them consistent.  Then I went back to the Posts Endpoint testing.  I completed the full GET happy path testing for simple and fully filled-in posts (text, photos, polls, mentions, and location).  I now have to add failure path testing on the GET, plus the corresponding test harness methods, to complete that and move on to creating and deleting Posts.

Diaspora API Dev Progress Report 5

Another day, another progress report on the state of the Diaspora API development.  I had hoped by now that I’d be picking up a little more speed, but I always underestimate how painstaking working on high-coverage unit tests is.  If I were doing a whack-it-together MVP startup-mode app I would still put automated tests around it for my own sanity, but since things are going to change, or maybe even get thrown away entirely, in relatively short order, there’s no need to go gnat’s ass down into the details.  That’s not the case with the API.  Yes, the API is technically in a draft mode, but it always looked like a really good draft.  The more I code against it and use it the more I believe that’s true.  Yes, my development speed is increasing as I become more familiar with all the technologies and get past some more technical hurdles, but it might take the better part of a man-month to finish this up (which is maybe a man-week more than I originally eyeballed).

The progress though has been steady.  I had a hiccup late last night with my test harness.  The Fuel HTTP library I’m using in Kotlin pushed a new release that requires the 1.3.0 version of Kotlin, which apparently is harder to come by than I thought.  Manually setting the version fixed it all, but not until after I spent half an hour fumbling around with it before giving up.  Today was the deep dive into the Comments endpoint.  As was the case with the previous Likes endpoint, Frank’s previous work left a very solid base.  Fleshing out the tests for some different errant behaviors, testing error messages as well as codes, and finding problems with the interactions once the test harness hits it over HTTP were the usual gremlins to squash.  Still, with only two more mostly fleshed-out endpoints to work with from Frank’s code base, I have a feeling that the development pace will be slowing down.  Maybe I’ll have gained sufficient efficiencies in my speed of coding on all of these to make up some of that difference.

Along with the above gremlins, now that the API is being exercised I am seeing some potential fine-grained details that need to be discussed.  That’s all tracked in the issue tracker on the API documentation page though.  Again, this is solid work by the team putting the API together and by Frank’s initial code base that I’m starting from.

In summary progress for the day:

  • Comments API Endpoint is finished and ready for pull request
  • Test harness example of interacting with the Comments API is completed
  • Some Issues were submitted to discuss minor changes to the status reporting back from the REST services on things like what happens when a Comment ID doesn’t match the Post ID that the REST endpoint was called with.
  • Some small documentation touch ups to address navigation

Diaspora API Dev Progress Report 4

Being in the early phases of getting the implementation started, it was inevitable I would encounter a little extra inertia to overcome.  Part of that is my own doing, but all of it is important for having confidence in what I’m developing.  The easiest part was filling out the API Implementation Stoplight chart so everyone, including me, can track what is going on with the development.  Then it was on to a fork in the road of sorts: do I want to start an external test harness now or wait until more is implemented?  I decided on the former.

Diaspora API Dev Progress Report 3

While I made progress with a few hours of Diaspora API Dev yesterday it wasn’t until today that I finished my first code change towards the API: completing the Likes Endpoint.

Diaspora API Dev Progress Report 2

Yep, two Diaspora API dev reports on one day.  After taking a break for dinner and just watching some TV I got back to figuring out how to properly interface with the authentication and API from an external client.  I was re-reading the OpenID spec, watching some videos, reading some presentations, et cetera.  If I’m going to be working on the API this is something I definitely need to be deep diving into a lot more.  My initial order of business however was just getting it working.
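Getting the authentication working from an external client ultimately comes down to standard OAuth 2.0 / OpenID Connect token requests.  As a hedged sketch, this is roughly the refresh-token request such a client would construct to stay logged in; the token endpoint path and credentials below are placeholders, not Diaspora’s actual values.

```ruby
require "net/http"
require "uri"

# Build (but do not send) a standard OAuth 2.0 refresh-token request
# (RFC 6749, section 6).  All concrete values passed in are placeholders.
def build_refresh_request(token_url, client_id:, client_secret:, refresh_token:)
  uri = URI(token_url)
  req = Net::HTTP::Post.new(uri)
  req.set_form_data(
    "grant_type"    => "refresh_token",
    "refresh_token" => refresh_token,
    "client_id"     => client_id,
    "client_secret" => client_secret
  )
  req
end

# A caller would then deliver it with something like:
#   Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(req) }
```

The successful response carries a fresh access token (and usually a new refresh token) that the test harness would present on subsequent API calls.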

Diaspora API Dev Progress Report 1

I’m only a few hours into getting fully going on the Diaspora API development project.  I had been pre-flying that whole experience earlier last week by studying the existing code base, familiarizing myself with the discussion threads et cetera.  Over the last couple of days I’ve been trying to focus more on moving the ball forward as well.  Before really doing that though there is still a little ground work to do.

Milestone: Higher Responses on Diaspora instead of Facebook

The Cambridge Analytica debacle from earlier this year and the subsequent #deletefacebook storm brought me to the alternative social media platform Diaspora.  At the time, as I wrote here, I had hoped to leave the walled gardens forever.  Initially I did just that, but practicalities eroded that forced isolation quite a bit.  In some cases, like DDG, I’m still 99% using the open alternative.  In others, like YouTube, I’m mostly using the old system because I just can’t get what I need out of the alternative yet (although I still try more and more every week).  For much of it, though, especially on the social media side, it’s more of a mix.  I’m on Diaspora as much as I’m on Facebook.  I’m on Mastodon more than I’m on Twitter, but Twitter was always a small platform for me versus my usage of Facebook.  The best way to think of this blend is that I try to make Diaspora and Mastodon my primary platforms and Facebook my secondary one, with Twitter a distant third.

What that means practically is that I’m pretty much logged into Diaspora, Mastodon, and Facebook continuously throughout the day.  The first places I post to are Diaspora and Mastodon.  The first places I check posts are Diaspora and Mastodon.  Most of the new activity from me is on Diaspora and Mastodon, with manual cross posting (thanks again, Facebook, for permanently screwing up your API to prevent external posting) when I want to share the same thing on Facebook as well.  Because I have just over 1000 friends on Facebook, and almost all of them are people I’ve interacted with in real life (most mere acquaintances or met once at a social function or something), there is just a larger volume of relevant and more personally resonating posts from others there.  So if one were to look at my activity feeds and notifications on a given morning, you’d see tons of activity on Facebook and a little activity on Diaspora and Mastodon.  Today was different.

Today the equation was reversed.  Today I had more interactions to wade through on Diaspora.  I had more relevant interactions to wade through at that.  I had more notifications to wade through.  I even got comparable engagement on my cross-posted material from late last night on all three systems.  That’s the first time that’s happened since I went back to having a foot in both worlds!

Is it that I crossed a tipping point in the people I’m connected to on these alternative social media systems?  Is it that the influx of Google+ users has caused a spike in engagement across the systems in question?  I don’t know the answer, and this will probably stay a noteworthy exception rather than the rule moving forward.  However, it can’t be a bad sign, except in one way.  In the span of writing this article, a free association lasting 15 minutes, I’ve already received almost ten notifications on Diaspora.  I know that the notification controls are not as fine grained on Diaspora as they are on Facebook.  It’d be a great problem to need to tackle that sort of feature request in the near future :).

Let the Diaspora API Deep Dive Begin!

I can’t express how happy I am that I have the privilege of having a combination of time, ability, desire, and energy to contribute substantially to the Diaspora project right now.  Ever since I started using it in the spring it’s something I’ve wanted to help with.  I certainly got my feet wet back then on some tweaks to the Twitter and Facebook interaction code, the latter of which is permanently broken thanks to Facebook’s new API spec.  Having gotten up to speed on Ruby, Rails, and the Diaspora code base, I’m looking forward to helping tackle a much larger and persistently requested piece of code: a Diaspora API.

Ramping Up Open Source Development Time

I’ve mostly been “microblogging” updates on Diaspora recently.  That’s a fancy way of saying I haven’t been doing any in-depth writing but instead just making quick ad hoc posts on social media.  As I am now ramping up my development on open source projects, primarily Diaspora by the looks of it, I’m hoping to start posting here more frequently, capturing lessons learned and observations from my exploration of these newer languages and code bases, and just getting more writing in.

Over the summer I actually spent a good deal of time exploring different cross platform development frameworks of the .NET and C++ variety.  That was intended for work on a very niche open source project idea I had conjured up around my classic computing hobby.  By the time I made enough progress to the point where I could potentially be productive (although I still want to explore wxWidgets a bit more), the bug to help on alternative social media platforms bit again.

So while I’ve been pining away for the opportunity to really start getting into Kotlin, JavaFX, and other technologies, my current path is taking me down the jewel-encrusted path of Ruby, Ruby on Rails, and JavaScript.  These are the technologies that Diaspora is built upon.  In fact, as I’ve written before elsewhere, I’m really enjoying the language a lot.  RubyMine could use a bit of polish compared to how well their IDEs work for Java and Kotlin, but it’s at least on par with the CLion C++ and Rider .NET IDEs.  Yes, I’ve fully converted to being a JetBrains user nowadays, even paying for a full license to the entire suite.  To people who know me, the fact I converted is probably going to be a bit of a shock.  Casual readers coming here from my non-software interests will have no idea what we are talking about, but IDEs are very personal decisions and we get wedded to them pretty hard.

Sorry for the absence.  I hope to be a regular poster again for the half dozen of you that actually read this!


Adjusting to not-as-social social media alternatives

I’m now three weeks into picking up and using non-walled-garden social media systems instead of traditional ones, specifically Diaspora over Facebook and Twitter.  It has mostly been a good experience, despite some major disagreement with some of their decisions on user experience and other rough edges that I hope to help fix soon as a contributor.  But the thing that sets social media apart from blogging or other static production ecosystems is the concept of sharing and interacting with other users.  By the nature of the fact that these massive digital halls are still pretty empty, I’m just not getting my fill of that.

Completing leaving the user data selling walled gardens

Over the weekend I made a bunch of progress on migrating away from the walled garden systems.  I’m happy to report substantially more progress since.  This will of course be an ongoing process of refinement and testing.  However, I’m currently getting substantial amounts of my needs met in enough areas that I’m prepared to start pulling the plug on Facebook, the Google ecosystem, Twitter, and so on.  When I wrote about this over the weekend I had completed my hypothetical replacement of several systems.  I have some updates to those elements as well.  My current replacement portfolio looks as follows (summary at the very end):

Progress on leaving the user data selling walled gardens

As I wrote earlier this week after the Cambridge Analytica event came to light my nagging feeling that I needed to get off these Facebook, Google, etc. platforms crossed a threshold.  It was no longer something that I thought I should do but something I was going to actively do.  In one week I’ve made progress in pretty much every dimension (scroll down to the bottom if you just want my list of alternatives).
