Diversion: Online site details, observations, and gripes

  The central website running Arisia '21 was a monumental undertaking, and the people behind it have a lot of my respect [and sympathy] for pulling off a complex event in what was ultimately a smooth and integrated way.  Sure, there were some missing pieces and a few hiccups, but you likely won't find any online endeavor on this scale entirely free of any of that.  As Justin, one lead of the "Remote" design team, put it in his own post-con summary:

  ... the underlying principle of Remote was that it was structured, very consciously, as a small software company.  We were building an online convention platform, focused on a specific instance of Arisia, but generally doing so using the standard principles of modern software development.

Most components of that were a swirling cloud of buzzwords from my backward view -- Scala, Angular, the Play framework, Typescript -- and personally I had only recently smelled PHP and Python from a distance, but hadn't really gotten into working with them much.  Not to mention that I had only started poking around GitHub maybe a month beforehand.  There wasn't really enough time for me to come up to speed on any of those things to usefully help.

However, I fell into my usual patterns when it came to evaluating the work product, particularly from the standpoint of resource containment.  Websites frequently do pull components from third-party repositories and content-distribution networks, partially for speed and partially for code that isn't maintained in-house but serves some desired function.  This has its downsides too, such as implicitly extending one's security perimeter out to the practices and competence of those third parties, and some degree of sketchiness certainly exists in those areas.  This is why it's *my* standard practice to note such things and at a minimum, ask content providers "hmm, did you really want to do that".


[image: 3rd party repository calls, dammit]

So once I finally got onto the convention website, my first instinct was to look at some page source, and I immediately got that "oh no" reaction from seeing some very typical third-party callouts.  But at the same time I noticed something I hadn't before: the "integrity" attribute, part of the Subresource Integrity (SRI) mechanism, which works alongside the cross-origin resource sharing spec and is intended to at least sanity-check what a CDN hands out.  Except that it's up to the local browser to actually check that, so it's more overhead on the client side if it even supports the mechanism.  And where is one supposed to gather those "verified" hashes in the first place?  With browser support for it still uneven at the time, it wasn't going to matter much, but more importantly there wasn't time to try and pull vetted copies of all of these resources in-house and serve them directly.  If online.arisia.org were a bank or medical portal or tax-payment gateway or anything dealing in critical personal data, that would be a different story.  But it wasn't, and about 0.1% of the user base actually notices this sort of thing, so we plunged onward.
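To answer my own "where do the hashes come from" question: an SRI token is just a base64-encoded digest of the exact bytes the CDN serves, prefixed with the hash algorithm.  A minimal sketch of generating one (the resource bytes here are a made-up stand-in, not anything from the actual site):

```python
# Sketch: computing a Subresource Integrity (SRI) token for a resource
# you intend to pin with an integrity="..." attribute.
import base64
import hashlib

def sri_hash(data: bytes, algo: str = "sha384") -> str:
    """Return an SRI token like 'sha384-<base64 digest of the bytes>'."""
    digest = hashlib.new(algo, data).digest()
    return f"{algo}-{base64.b64encode(digest).decode('ascii')}"

# Hypothetical stand-in for a vendored copy of a CDN-served script.
resource = b"console.log('hello');\n"
token = sri_hash(resource)
print(token)

# The token then goes into the tag, roughly:
#   <script src="https://cdn.example.com/lib.js"
#           integrity="sha384-..." crossorigin="anonymous"></script>
```

If the CDN ever hands out different bytes, a browser that checks SRI refuses to run them -- which is exactly the sanity check described above, at the cost of redoing the hash every time you legitimately upgrade the resource.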

[image: reports: 3rd-party MEH and the prelim cat/dogfact drop]

Still, we had a little discussion about it, and Justin acknowledged that it wasn't ideal -- especially given some recent high-profile news about software supply chains and trust relationships.  I was also there to confess my own extension of trust to third parties, albeit for far more lighthearted purposes, described later in the "pet quotes" rathole.  When you think about it, using Discord and bot infrastructure is about as third-party as it gets, especially considering that the price is right.  So heck, you could argue that Remote's "conferencing platform" was farther along the path to security righteousness than the stuff I was messing with.

[image: ducks: really to show NoScript on 3rdparties]

To use the online site productively I had to do the usual enabling of various script suppliers, both in NoScript and in the proxy layer which brokers my overall access to domains as a whole.  This is one reason I keep all my browser configurations in a 100% volatile ramdisk-based setup that can easily be zapped back to a known state.  I think by the time the con was over I had four different Firefox "profiles" at hand, each set up to bless a different set of stuff as nominally trustable.

[image: quickly devolves into language discussion]

Back-channel nattering about this stuff quickly devolved into language geeking, because that's just how some of us are.  But seriously, this is the kind of conversation we'd be having almost anywhere in real life, whether while unloading a truck or slinging cable in a ballroom or huddled in a corner of a party room.  As ethereal as the subject matter could get, this was a very human level of interaction.

[image: last year's test data JSON object, omgwalloftext]

So along the way I took a superficial look at some of the site code, since it was all up on GitHub for perusal.  The convention schedule backend was very similar to KonOpas, which uses a one-shot download of ALL of an event's data which it can then work with from offline browser storage.  The data is extracted from the programming backend and represented as a massive JSON blob -- even the big-pic here is only part of it.  One of these was present as a test object in the codebase, except that it was taken from *last year's* data.  I had the thought that maybe finding this year's "blob" and prettifying the wall of text out to something more searchable would let me find items with tools I'm used to, such as who the panelists were for a given session.  That turned out to be harder than I expected, because the URL to fetch it wasn't obvious.

[image: main item dispatcher HTML: cached JS blobs]

This snippet appeared to be the main dispatcher from the site's landing page, linking out to various other functional areas.  Amusingly, one of those sequence-of-big-numbers Discord channel links is hardwired in, which would have led to that same "limbo state" view if someone following it hadn't been properly gated into the Discord yet.  The "/events" page contained an embed of whatever YouTube or Twitch stream was running at the time, with a link to go directly to the native player view if desired.

Yes, I was looking specifically for more parts that dealt with Discord...


[image: demoing session-only cookies, reply is 'why ever close tabs']

Another annoyance about the site was that all the login cookies were volatile, or session-only.  I don't leave browsers open for weeks at a time like some people do, so I kept getting kicked out any time I restarted mine.  For a site I was expected to stay logged into all weekend to perform my various support duties, that doesn't make sense, considering that I generally clean up before putting things to bed for the night, and any number of other sites can still remember who I am regardless.
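The mechanics here are simple: a Set-Cookie header with no Expires or Max-Age is a "session" cookie and dies with the browser, while adding a lifetime lets it survive a restart.  A sketch of the difference (the cookie name and value are made up, not what the site actually used):

```python
# Sketch: session-only vs. persistent login cookies, using the stdlib
# cookie machinery.  Names and values here are illustrative.
from http.cookies import SimpleCookie

# No lifetime attributes -> gone when the browser exits.
session_only = SimpleCookie()
session_only["auth"] = "token123"

# Max-Age -> survives a browser restart for the stated lifetime.
persistent = SimpleCookie()
persistent["auth"] = "token123"
persistent["auth"]["max-age"] = 3 * 24 * 3600   # long enough for a con weekend
persistent["auth"]["secure"] = True
persistent["auth"]["httponly"] = True

print(session_only.output())   # Set-Cookie: auth=token123
print(persistent.output())     # ...includes Max-Age=259200
```

So remembering a login across restarts is a one-attribute change on the server side, which is what makes the "why would you ever close a tab" answer below all the more grating.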

The answer from other folks:  "Why would you ever want to close a tab?" 

Hmmmph.


  One of my Zoom-hosting slots was for running a webinar, and confused reports had already come back into the tech-ops channel from someone else who also had a webinar and couldn't find the typical "start broadcast" or "go live" button.  Because there wasn't one.  The webinars were apparently launching out of the backend without the "practice session" enabled, which was the complete opposite of the workflow that several of us had been using while hand-launching webinars in ALL of our prior events.  The sessions started "hot" to the world, and simply relied on attendees not having the join link until about five minutes before start time.

[image: Misunderstanding of how webinars work]

This seemed extraordinarily short-sighted to me, because it had always been up to a host's judgment exactly when to go live, or when to hold while panelists weren't ready, the bumper-slide wasn't up, or whatever.  And waiting *five* minutes with a bumper slide up and panelists asked to stay quiet while attendees filter in is an eternity compared to the minute or less we usually ran with.

I took this up with Justin, who didn't even realize it was a problem.  It was far too late to fix it, too, so fortunately I knew all this going in when it came time for me to run my webinars.  Luckily we didn't have too many of those; most sessions were done as meetings which started with reasonable settings.


[image: excuses for not fully handling webinar]

I found out later that it wasn't entirely Justin's omission; the Zoom API reference describes some fairly stupid defaults for webinar creation, and implies that the overall settings one laboriously grinds through at the web portal don't even come into play when launching via API.  Way to go, Zoom -- gratuitously throwing off your subscribers' workflows just because they tried to do something a little more elegant.
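For the curious, Zoom's REST API does expose a practice-session flag at webinar-creation time; the backend just has to ask for it.  A sketch of the request body, assuming the `settings.practice_session` field as described in Zoom's API v2 docs (no actual request is sent here, and the topic/time values are placeholders):

```python
# Sketch: the payload an API-driven backend could POST to Zoom's
# "create webinar" endpoint (/users/{userId}/webinars) to get the
# host-controlled "go live" workflow back.  Illustrative values only.
import json

def webinar_payload(topic, start_time_iso, duration_min):
    return {
        "topic": topic,
        "type": 5,                    # scheduled webinar
        "start_time": start_time_iso, # ISO 8601, e.g. "2021-01-15T19:00:00Z"
        "duration": duration_min,
        "settings": {
            # Defaults leave this off, so API-launched webinars start
            # "hot"; enabling it restores the practice-session hold and
            # the explicit "start broadcast" step.
            "practice_session": True,
        },
    }

payload = webinar_payload("Panel: Fandom Online", "2021-01-15T19:00:00Z", 75)
print(json.dumps(payload, indent=2))
```

One boolean, in other words, between "hot to the world" and the workflow every hand-launched webinar we'd ever run had relied on.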

  Overall, though, the site did a great job of pulling together lots of critical information in one go-to place, making it pretty easy for everyone to find what they needed quickly.  In less than a third of the time such a project should have taken, the Remote folks spun up their magic and got it together for the good of the community, and for that they should be rightly proud.  Never mind that a few bits might have been missing or stubbed off here and there; the audience doesn't see what the audience doesn't see.  The numerous accolades that rolled in afterward speak for themselves.

This leaves a real dilemma to ponder, to weigh the merits of either a> wanting to carry stuff like this forward and turn it into something really stellar for next year, versus b> desperately wanting to never have to do this again and to simply get back to normal.  Thought trends seem to point toward future events having significantly more online presence regardless, but until people start fully realizing the work factor involved in "going hybrid" that may only be lofty fantasy so far.


_H*   210129