Mark Poepsel – SIUE THATCamp 2016
http://siue2016.thatcamp.org – Engaging Communities Through Digital Humanities

Maker Entry: Campaigndium
http://siue2016.thatcamp.org/2016/06/12/maker-entry-campaigndium/
Sun, 12 Jun 2016 04:27:31 +0000
Mockup of Pitch for “Campaigndium” – a social filtering platform-tool.

Only one person came to my session, but we still came up with the outline of an idea that could be revolutionary. I’ve been kicking around the idea of journalism as campaign for a couple of years and never published much because it’s not a well-defined concept yet. The gist is that the role of journalists is to cut through glut more than it is to gather information and serve it up to people in some kind of farm-to-table model. That’s the old way. It struck me, in hearing about Omeka and about map building, that the Digital Humanities is about both gathering hard-to-find information and cutting through the glut. You go find hard-to-find items. You find more of them than you need. You store them in the “basement” or some cool Indiana Jones-ish warehouse, and then you put together exhibits of the best of the best, or the most appropriate of the best, based on whatever the exhibition is about.

So journalism (and filtering kid-oriented videos on YouTube) could do the same thing. Find too much of a good thing, but certainly not all that is out there, and then curate (journalists are falling in love with this term as if it were a new idea) the crap out of it (heh, literally).

So that leaves the question of “How?”, and that’s what our little twosome came up with during my maker session yesterday. What about match.com or lumelle (which I just Googled 15 minutes ago, tbh)? What if you filtered your “crew” first and then made a campaign out of your search for good stuff? Make the search a combined gathering and filtering effort with the goal of building a “filter pool,” or something that sounds less “above-ground-ey.” Then, from this pool of vetted resources gathered in a crowd-sourced or maybe even gamified way (this really smacks of an MMORPG with little questing groups, right?), you’d get what you need based on some well-established parameters agreed upon when you start – parameters that can be changed, but not easily.

So, what you’ve got is a platform for campaigns that can focus more on culling through huge amounts of readily-available stuff or that can focus on gathering really rare things – indigenous language oral histories, for example. And either way you get this pool you can use to make exhibits or news stories or playlists or whatever.
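
To make that a little more concrete, here is a minimal sketch in Python of how a campaign, its crew, and its filter pool might fit together. Everything here – the class names, the methods, the two-approval threshold – is a hypothetical illustration of the idea, not an existing platform or API.

from dataclasses import dataclass, field

@dataclass
class Item:
    """One gathered resource: a video, an oral history, a document, etc."""
    url: str
    submitted_by: str
    notes: str = ""
    approvals: set = field(default_factory=set)   # crew members who have vetted it

@dataclass
class Campaign:
    """A gathering-and-filtering effort run by a small, pre-filtered crew."""
    topic: str
    crew: set          # the "crew" you filter first
    parameters: dict   # agreed upon at the start; changeable, but not easily
    pool: list = field(default_factory=list)

    def submit(self, item: Item) -> None:
        # Anyone on the crew can add to the pool; nothing is "exhibited" yet.
        if item.submitted_by in self.crew:
            self.pool.append(item)

    def vet(self, item: Item, member: str) -> None:
        # A second (or third) crew member signs off on an item.
        if member in self.crew and member != item.submitted_by:
            item.approvals.add(member)

    def exhibit(self, min_approvals: int = 2) -> list:
        # Pull the best of the pool for an exhibit, story, or playlist.
        return [i for i in self.pool if len(i.approvals) >= min_approvals]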

It’s social filtering and shared sourcing, at human discretion. Algorithms are often biased by the culture of the people writing them. They privilege popularity over quality. They can’t find what’s not already online and properly tagged or linked. All they can tell you is what everyone else is looking at and what the people you already interact with a lot are doing and saying.
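
To put that contrast in code terms: an algorithmic feed ranks by whatever proxy for popularity it has, while human filtering only surfaces what real curators have vetted. A rough sketch with made-up field names:

def algorithmic_feed(videos):
    # Popularity stands in for quality: whatever everyone else is watching.
    return sorted(videos, key=lambda v: v["view_count"], reverse=True)

def human_filtered_feed(videos, min_curator_approvals=2):
    # Only items that actual people have vetted make it through.
    return [v for v in videos if len(v["approved_by"]) >= min_curator_approvals]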

Social filtering – human filtering – is a tool that still needs to be created and refined, and the image shared above is a good pitch to get funding to build this kind of platform-tool.

Make Session Proposal: Creating a Community Curation Platform for Child-Appropriate Web Videos
http://siue2016.thatcamp.org/2016/06/10/make-session-proposal-creating-a-community-curation-platform-for-child-appropriate-web-videos/
Fri, 10 Jun 2016 18:06:06 +0000

My son is four years old, and though I hate to admit it, he watches a lot of YouTube. It’s our go-to diversionary tactic for short- or medium-length car rides. This means he might watch for 20, 30, even 45 minutes at a time.

I would like to curate a list of appropriate videos for kids aged 3-5 years for parents who allow that sort of thing but who want some level of say in what their kids watch.

I’m also proposing to document that list in a blog so parents can see why each video has been approved.

Ultimately, I think we’ll have better tools for this kind of process in the future – a sort of networked sharing where algorithms take a back seat to friendly curation by organized, actual humans.

There are some pitfalls to watch out for. Obviously not everyone approves of the same videos. What goes in and what’s culled is going to be up to us. We could set guidelines for our choices and parameters for the scope of what we’re going for (top 100 list or just a list of anything of value for which we can make a good case that it should be shared?), but you can probably imagine how tastes vary and evolve and how rules are made to be broken. You can also imagine how someone might try to hijack an effort like this as a prank or as a political act.
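
One way to pin those guidelines and parameters down would be a small, shared configuration the group agrees on up front. The field names and numbers below are purely illustrative, not the settings of any real tool:

from dataclasses import dataclass

@dataclass
class CurationGuidelines:
    age_range: tuple = (3, 5)            # target viewers
    scope: str = "anything of value"     # or "top 100 list"
    require_written_case: bool = True    # every pick needs a justification paragraph
    min_approvals: int = 2               # a second curator signs off (some hijack resistance)
    rules_changeable_by: str = "group vote"   # rules can evolve, but not unilaterally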

So, we should also take this as an opportunity to discuss how difficult human curation may be but also how important this kind of curation might be given that algorithms aren’t infallible.

Algorithms are making media decisions for us, and though we may not always make better choices and we can’t compete with these types of media controls in terms of scale (at least not yet), we might build curation networks around our most important concerns, starting with what our kids are exposed to and what they’re learning from.

Here’s the image I want to leave you with. I will sit down with Sammy and watch some interesting videos about monster trucks each painted a different color or carrying a different number on its side. To me, this is fine. This is like Sesame Street – bite-sized lessons about basic building blocks of communication and mathematics presented in messages that a young child can easily consume.

But (and you knew that “but” was coming) after three or four minutes of somewhat educational videos playing for Sammy in his car seat, I hear dinosaurs roaring and explosions going off, and I come to find that there’s a fire-breathing dinosaur (as if!) battling a monster truck in a swamp, all made hastily with cheap graphics, carrying information that is scientifically confusing and of negative social value.

I mean, sure, sometimes a dinosaur has to solve his problems with violence, but tail smashes should be the last option.

So, can we use the power of a small network of smart people to curate a list of videos (perhaps a YouTube playlist) and then can we justify in a paragraph or two (perhaps through a shared WordPress site) the videos we like?
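
At its simplest, that could be a short script that takes our approved picks, each with its justification, and spits out a draft post for the shared site. The entries below are invented placeholders, not actual recommendations:

curated = [
    {
        "title": "Counting Monster Trucks 1 to 10",
        "url": "https://www.youtube.com/watch?v=EXAMPLE_ID",   # placeholder, not a real video
        "curator": "Mark",
        "justification": "Short, calm, and genuinely about numbers; no surprise dinosaur battles halfway through.",
    },
]

def draft_post(entries):
    # Render the approved videos and their justifications as a plain-text blog draft.
    lines = ["Approved videos for ages 3-5", ""]
    for e in entries:
        lines.append(e["title"] + " - " + e["url"])
        lines.append("Why we approved it (" + e["curator"] + "): " + e["justification"])
        lines.append("")
    return "\n".join(lines)

print(draft_post(curated))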

If we can do that, THEN, can we think of a platform or a process whereby all kinds of groups might be able to set up these lists? Can this be done in meaningful ways that aren’t already covered by Facebook, Tumblr, and Twitter and with relatively easy-to-use interfaces for both the curators and the “audience”? What is our audience? Do we have one? How big might it be?

Finally, what are the ethical implications of this level of peer curation? Are we building a system to create echo chambers for kids? Might parents try to use this to limit what their kids are exposed to in ways that are relatively negative? (I’ve built my grandson Mason an ALL TRUMP CHANNEL so he can learn early about what’s really going on in this country.)

Do we worry about that when trying to build this kind of network tool?

I’m certain there’s a use for this. It definitely comes with caveats, but this is my proposal: that we make a prototype for networked curation on a small scale, coupled with documentation explaining our choices, so there could be a level of transparency and thoughtful reasoning behind media curation, at least where we care to set it up.
