I’m following up Paul Hill’s recap of the FORCE11 Conference Keynote highlights with all of the “extra” goodness that came out of the conference’s concurrent and ignite sessions. If you haven’t had the opportunity to read through the previous post, stop what you’re doing and go there now.

Alright, now that you’re excited, you are fully prepped for our final FORCE11 insights.



altmetrics: non-traditional metrics proposed as an alternative to more traditional citation impact metrics in scholarly or scientific publishing

How often are we using altmetrics in Extension? I promise there are many states and individuals using social media metrics and other metrics from non-traditional forms of outreach, but are they reporting the right metrics? What does non-traditional “impact” look like?

Our view of what “impact” looks like mainly focuses on behavior change and possibly awareness. It needs to be broadened regardless of the method of outreach. Altmetrics expand not only our view of what impact looks like, but also of what’s making the impact. This matters because expressions of scholarship are becoming more diverse. Browsing through the Altmetrics.org Manifesto or the AHA Digital Scholarship Guidelines gives you a crash course in how the world of academia is beginning to embrace the evolution of scholarship.

We can use altmetrics to answer questions like these (a quick code sketch follows the list):

  • Who is using our information, tools, apps?
  • How are tools, apps, and info being discovered?
  • Are tools, apps, and info meeting end users’ needs?
  • Is there a community growing around our tools, apps, and info?
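
To make that concrete, here is a minimal, hedged sketch (my own illustration, not something presented at FORCE11) of asking the “who is using our information?” question for a single publication via the public Altmetric API. The DOI is a placeholder, and the response field names are assumptions you would want to verify against the current API documentation.

```python
import requests

# Sketch: query the public Altmetric API for attention data on one publication.
# The DOI below is a hypothetical placeholder; field names reflect typical
# Altmetric JSON responses and should be checked against the current docs.
doi = "10.1000/example-doi"  # hypothetical DOI
resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)

if resp.status_code == 200:
    data = resp.json()
    print("Tweets:", data.get("cited_by_tweeters_count", 0))
    print("Blog posts:", data.get("cited_by_feeds_count", 0))
    print("News stories:", data.get("cited_by_msm_count", 0))
    print("Altmetric score:", data.get("score"))
elif resp.status_code == 404:
    print("No attention data found for this DOI.")
else:
    print("Request failed with status:", resp.status_code)
```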

EXPLORING THE MEANING OF ALTMETRICS

Stefanie Haustein

University of Montreal

Haustein provided an overview of what can count as altmetrics, which spans a wide variety of platforms and types of use.

Some examples of altmetrics included a look at engagement levels on Twitter when scholarly information was shared in different ways. When bots shared the content (consisting of only an article’s title plus a link), engagement was low. However, when snippets of content were shared from the article and a conversation was initiated, engagement was very high.

It’s up to the scholar, and up to us as Extension professionals attempting to actually engage with our online audiences, to take that extra step of starting those critical conversations. Don’t be a bot – provide content and context. And start the conversation.

Scholarly Tools of the Trade to Experiment with Now:

  • https://www.mendeley.com; Mendeley is a free reference manager and academic social network. Make your own fully-searchable library in seconds, cite as you write, and read and annotate your PDFs on any device.
  • https://www.researchgate.net; ResearchGate spreads (via email) and hosts a lot of scholarly content, but does not see much engagement. Most engagement around scholarly content happens on Twitter, yet 80% of scholarly content never makes it to Twitter.

What is currently being measured with altmetrics in academia?

  • Societal impact
  • Scholarly impact
  • “Buzz”

Are we measuring any of the above in Extension? I think not. We’re only in the beginning phases of tracking societal impact. But we have to start somewhere.


USING ALTMETRICS TO TRACK OPEN SCIENCE ACTIVITIES

Holly Bik

Center for Genomics & Systems Biology at New York University

hollybik.com

Bik argued:

  • Software developments and products are not understood or appreciated by most peer reviewers
  • Dissemination and use of software must be evaluated in different ways than traditional academic products

Bik has developed http://phinch.org, a kayak.com for massive genomic datasets. We know in Extension that it’s tough to beat the power of face-to-face connections, and Bik also knows this to be true. By going to conferences, the downloads and use of her software increased with every meeting she attended. She learned that serving as a salesperson for her software was highly effective, simply because people needed a human to explain why they needed the software.

Another large genomic dataset tool mentioned during FORCE11 was https://opensnp.org. openSNP allows customers of direct-to-consumer genetic tests to publish their test results, find others with similar genetic variations, learn more about their results through the latest primary literature on their variations, and help scientists find new associations. The project is crowdfunded by donations and works with Google Summer of Code students to keep building the platform. They have built a community that cares enough to financially back it and keep it sustainable.

To track web impact (views, where visitors are coming from, and how they’re finding the software), Bik uses GitHub. But in her world, the interpretation of impact is very different from how we in Extension define “impact” (mostly website visits, click-through rate, and user engagement).
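
As a hedged aside (my own sketch, not something from Bik’s talk), the GitHub REST API exposes roughly the last two weeks of repository traffic, which covers the views and where-visitors-come-from questions. The owner, repository, and token below are placeholders, and the traffic endpoints require push access to the repository.

```python
import requests

# Sketch: pull recent traffic stats for a repository via the GitHub REST API.
# OWNER/REPO and TOKEN are placeholders; the traffic endpoints require push
# access and return roughly the last 14 days of data.
OWNER, REPO = "example-org", "example-repo"   # hypothetical repository
TOKEN = "ghp_your_token_here"                 # personal access token

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Accept": "application/vnd.github+json",
}
base = f"https://api.github.com/repos/{OWNER}/{REPO}/traffic"

views = requests.get(f"{base}/views", headers=headers, timeout=10).json()
referrers = requests.get(f"{base}/popular/referrers", headers=headers, timeout=10).json()

print("Views (14 days):", views.get("count"), "unique visitors:", views.get("uniques"))
for ref in referrers:
    print(f"Referrer {ref['referrer']}: {ref['count']} views")
```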

To broaden our definition of impact, how we acquire metrics, and which tools we’re using to do so, Extension can benefit from:

  • https://impactstory.org; Impactstory can help track buzz on Twitter, blogs, news outlets, and more. Think of it as a Google Scholar for your research’s online reach.

DEMONSTRATING IMPACT AS A PRACTITIONER-SCHOLAR

Heather Coates

IUPUI University Library Center for Digital Scholarship

Promotion and tenure is digital scholarship’s Berlin Wall. Coates agrees. Most of the amazing work we do is not addressed in P&T documents. These laborious and tedious reports only reflect a brief snippet of Extension and scholarly work.

In order to succeed with digital works, Coates suggested, a strong support network for pre-tenure candidates has to exist, with mentors who have been through the process and successfully stood by their non-traditional methods of teaching, research, and outreach.

Another sure-fire way to succeed through P&T with non-traditional work? Impact and judgement of quality cannot be left solely to the peer reviewers. Bottom line: it’s not just Extension professionals; faculty are also scared to present their altmetrics in their P&T documents. They know they will be compared to others who may only have traditional information and metrics in their documents.

So, who wins in a race to the bottom?


DIGITAL DISEASE DETECTION AND THE FUTURE OF PARTICIPATORY RESEARCH

John Brownstein

Chief Innovation Officer, Boston Children’s Hospital

Professor at Harvard Medical School

Co-Creator, HealthMap

@johnbrownstein

HealthMap of Zika cases in Ohio and the surrounding area.

Need or want information on the spread or occurrence of disease in our country? John has a web or IoT (Internet of Things) tool for that. http://healthmap.org is a living, breathing map of disease outbreaks, created by Brownstein, who is studying ways of using big data streams to anticipate disease outbreaks.


My biggest takeaway: professional narcissism exists everywhere, and we have the power to overcome it.

FORCE11 was amazing to experience as someone who has attended dozens of Extension conferences and rarely gets to glimpse the world outside of our Extension bubble. Venturing outside of our comfort zone and attending non-Extension events provides so many opportunities to expand our resources, tools, and even support network. To know others are struggling with the same issues we are as an organization, and to know that there are people out there doing something about it, is invaluable. We need to do this more often. Much more often.