Posts Tagged: publishing

Scholarly Communications Lab Projects

Like nearly everyone in the publishing industry these days, the publishing professors at SFU have plenty going on. For example, take Dr. Juan Pablo Alperin, who teaches PUB802: Technology & Evolving Forms of Publishing, is the Public Knowledge Project’s Associate Faculty Director of Research, is the recent recipient of the Open Scholarship Award from the Canadian Social Knowledge Institute, and on top of all of that is also the man behind many research projects at the Scholarly Communications Lab, which he co-directs.

While not all of their projects are directly related to the trade side of the industry, almost everything they do concerns scholarly publishing. We wanted to highlight some of the interesting things he and his team have been working on lately.

Cancer in the News
Alperin’s team is looking at news coverage of government-funded biomedical research papers (specifically on cancer) by analyzing how many times each study is mentioned in the news, and where. More specifically, they are looking at how the news is shared across the four tiers of news coverage in both traditional and non-traditional outlets—a hierarchy that editors often use when determining the value of a story.
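
As a rough sketch of the counting step (not the lab’s actual pipeline), mention tallies per study and per tier could be computed with a few lines of Python; the outlet-to-tier mapping and record format below are invented for illustration.

```python
from collections import Counter, defaultdict

# Hypothetical outlet-to-tier mapping (tier 1 = large national outlets, tier 4 = niche
# blogs); the real tier assignments are editorial judgments not given in the post.
OUTLET_TIER = {"nytimes.com": 1, "cbc.ca": 1, "localpaper.example": 3, "cancerblog.example": 4}

def tally_mentions(mentions):
    """Count news mentions per study, broken down by outlet tier.

    mentions: iterable of (study_id, outlet_domain) pairs.
    """
    per_study = defaultdict(Counter)
    for study_id, outlet in mentions:
        tier = OUTLET_TIER.get(outlet, 4)  # treat unknown outlets as the lowest tier
        per_study[study_id][tier] += 1
    return per_study

# One cancer study mentioned twice in tier-1 outlets and once in a niche blog.
sample = [
    ("10.1000/example-doi", "nytimes.com"),
    ("10.1000/example-doi", "cbc.ca"),
    ("10.1000/example-doi", "cancerblog.example"),
]
print(dict(tally_mentions(sample)))  # {'10.1000/example-doi': Counter({1: 2, 4: 1})}
```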

Open Source Altmetrics
They’re also building an open-source tool that provides academic journals with article-level metrics. Altmetrics are alternative ways of measuring scholarly impact, such as references in online news media and social media, as opposed to more traditional measures such as citation counts.

RPT (Review, Promotion, and Tenure) Project
Alperin and his team investigated the review, promotion, and tenure (RPT) process in the U.S. and Canada. Their goal was to deliver recommendations to universities and colleges that would encourage behavioural change towards a greater opening of access to research results.

They began by collecting RPT guidelines from over 100 institutions and assessed the degree to which they included Open Access (OA) recommendations.

“Despite countless policies and mandates promoting open access, as well as the development of tools and resources that facilitate it, and despite years of advocacy work, the majority of researchers are still not compelled to make their research outputs publicly available because the incentive structures that drive faculty’s research dissemination strategies remain unchanged,” says the team.

They found that only five (1%) of the RPT guidelines they studied explicitly mentioned OA, and in four of the five cases it was only “done to call attention to the potential problematic nature of these journals (which are seen as potentially of lower quality than subscription journals).”

The team is continuing on to Phase II of the project, where they will be studying faculty perceptions and beliefs regarding the RPT process, how RPT documents influence perceptions of the process, and the factors outside of RPT guidelines that influence how faculty disseminate their research.

Social Media Use by Researchers
In April, the team hosted a roundtable discussion about using social media to share science stories. Invited were: a YouTuber, an Instagram biologist, a traditional science journalist turned freelancer, and a journalist from Hakai Magazine (which specializes in citizen science).

Science Writers and Communicators of Canada
Similar to the roundtable discussion on how scientists are sharing science stories via social media, this project also looks at how science communicators are using non-traditional methods to share their message (such as vlogging, Instagramming, etc.).

As Science Writers and Communicators of Canada recently added “communicators” to their focus, Alperin’s team wanted to look at who and where these communicators are and how best to support them. They also wanted to look at how they differ from conventional science communicators in terms of standards of ethics, accuracy, and practice; how they see themselves; and how they reach their audiences.

The findings will help identify the goals and challenges of science communication in Canada, and how to best support, train, and create outreach activities that will improve the quality of public engagement with science.

Diabetes Forums
The team combed the social web to identify public concerns about diabetes to direct academic research on the disease. This method of harnessing public engagement to directly impact research helps connect and involve the general public in academia, and vice versa.

Measuring Facebook Engagement
Many people share things over social media privately, such as through direct message or email. This sharing, known as dark social, currently cannot be accurately tracked. So the team looked at how altmetrics measure dark social, and found that there is a considerable amount of sharing done out of the public sphere that is captured by altmetrics.

Some of their work has also recently been published in papers:

Zika and Language Use on Social Media
This paper looks at the surge in Zika research during the Zika virus outbreak. Although the purpose of sharing that research was to communicate with and inform the general public, the team used a language detection algorithm and “found that up to 90% of Twitter and 76% of Facebook posts are in English” despite English not being the first language of those at the centre of the epidemic.
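
As a minimal sketch of the idea (not necessarily the team’s exact method), an off-the-shelf detector such as the Python langdetect package can classify the language of each post and report the share per language; the sample posts below are invented.

```python
# A minimal sketch of language detection over social media posts, assuming the
# third-party langdetect package (pip install langdetect); this is not necessarily
# the algorithm the team used, and the sample posts are invented.
from collections import Counter
from langdetect import detect

posts = [
    "New Zika study maps transmission patterns",   # English
    "Novo estudo sobre o vírus Zika no Brasil",     # Portuguese
    "Zika virus research round-up for this week",   # English
]

counts = Counter()
for text in posts:
    try:
        counts[detect(text)] += 1      # returns an ISO 639-1 code such as 'en' or 'pt'
    except Exception:                  # langdetect raises an error on empty/ambiguous text
        counts["unknown"] += 1

total = sum(counts.values())
for lang, n in counts.most_common():
    print(f"{lang}: {n / total:.0%} of posts")
```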

Among other things, their paper says, “Our results suggest that Facebook is a more effective channel than Twitter, if communication is desired to be in the native language of the affected country.” They also explain that altmetrics favour English-language communication, large Western publications, and Twitter, meaning we need to build nationally relevant metrics in order to more accurately measure social impact.

Looking at Networks on Twitter
This paper looks at how primary research literature affects the public’s understanding of and engagement with science, and how knowledge diffuses through social media. In their small case study, they found that shared Open Access articles tended to stay within small communities made up mainly of researchers and did not generally reach the broader public.


The MPub Media/Tech Project

It’s been around a month since the classwork portion of our Master of Publishing degree wrapped up, and now that I’ve had some time away from the intensity of the last few weeks of school, it seems like a good time to talk about the Media/Tech Project.

In the fall semester we devoted six weeks of our lives to starting fictitious publishing companies complete with a detailed list of books. But what to do in the second semester of a publishing degree?

In the spring, the program moved away from books to focus on media and technology (in the past, the program focused more heavily on magazines). As the publishing industry changes, it has become clear that in order for publishers to remain relevant, they must understand how technology impacts all aspects of their business. It’s not enough to focus on print and traditional forms of publishing. We have to look ahead to what publishing could become. And so, our class became Media/Tech Project guinea pigs.

While we started off the semester working on the Media project and finished with the Tech project, for all intents and purposes they were the same thing—the second was simply an extension of the first, which meant the project ran the entire course of the semester.

On the second day of class after the holiday break, we were divided into our groups and told to form media companies based on directions we pulled out of a hat. One group was assigned B2B (they pivoted and became NFP2NFP instead), another group got arts and crafts, and the final group pulled politics. From there, the groups were tasked with building a media entity from the ground up.

How do you build a brand? How do you become financially viable? How do you grow sustainably? What gap in the market are you meeting? What will your product be?

In our groups, we began to answer these questions and sketch out our business plans. Nearly every week, groups met with instructors to pitch their updated businesses, which evolved as we completed more research and received more feedback. At the beginning of the project, it was stressed that our start-ups would need to be agile, and that became our mantra as the semester progressed and the work piled up.

And every week, we were given additional pieces to complete. Brand guidelines. Marketing and advertising plans. Financials. Websites. Podcasts. The list went on.

Halfway through the project we were divided into additional groups with specific skills (this is where the Tech project came in). The Web Development, Analytics, Media Production, and Ebook teams provided focused support to their media entities following a series of mini lectures aimed at providing them with hands-on skills. Of course, all students were invited to attend the other teams’ lessons.

And just like the fall book project, we made it through to the end of the semester, presenting our launch-ready companies to panels of industry guests. Some of the most rewarding feedback we received was that our final companies were even pitch-worthy to potential buyers. And some of the best presentations I’ve ever seen were on that final day as well: one group even “recorded” the beginning of a podcast as part of their presentation.

While the Media/Tech project will undoubtedly look very different by next spring as our field continues to evolve and the skills that are in demand change, what I hope future classes also take away from the project is the importance of being flexible and able to find creative solutions.


Illustrious Alumni: Shirarose Wilensky & Paula Ayer Talk Mentorship

I’ve talked to a handful of Master of Publishing alumni lately, and somewhere in the conversation I always ask what advice these accomplished, successful women have for the next generation of publishing professionals. Their answers have been strikingly similar: work hard, accept opportunities, ask questions, and seek out mentorship.

And it’s that last point that I want to focus on today, both from the perspective of the mentees (Shirarose Wilensky from Arsenal Pulp Press and Paula Ayer from Greystone Books), and a mentor whose name has come up again and again (Nancy Flight from Greystone Books).

Ayer and Wilensky’s stories are similar in a lot of ways. They both completed the Master of Publishing program at SFU around 10 years ago; they have both done freelance work and have worked for independent publishers in Vancouver; and they are both local editors who have recently transitioned into roles with substantial responsibility.

Wilensky just took on the position of Editor at Arsenal Pulp Press after freelancing for the past few years, while Ayer became Editor at Greystone Books in the fall after spending nearly a decade working at Annick Press. As a current student in the MPub program, I’ve found it both reassuring and exciting to talk to alumni and glimpse where my own career could take me within the next decade.

“Take every opportunity that might be offered to you, talk to as many people as you can, go to events, and volunteer at the Writer’s Fest,” Wilensky suggests when I ask about nurturing your career path. Ayer lists all of the same things, and adds that you should also showcase your special skills.

And of course, they both speak to the value of mentorship, and cite the value of the connections they made in the MPub program.

“I can’t overstate the importance of mentorship. If there is a specific person you really admire, approach them,” Wilensky encourages, saying that a good way to find a mentor is to find someone who is doing what you’d love to be doing in the future. “Recognize and appreciate how important, valuable, and rare these relationships are.”

She highlights how mentors can share both professional and personal advice, and can give you those always-important job recommendations. In return, she says, make sure to show your appreciation for your mentor, who is likely very busy with their own career.

Ayer echoes her advice. “Use the connections you make—don’t be shy to send them an email, go to industry events, keep nurturing those relationships, and show people you can do good work.”

The editors are quick to highlight the mentors who have played significant roles in their careers. Ayer thanks long-time MPub instructor Mary Schendlinger from Geist Magazine and Colleen MacMillan from Annick Press, who she says gave her opportunities, believed in her, and were brilliant teachers. Wilensky mentions Nancy Flight, also a past MPub instructor and current Editor Emerita at Greystone Books, whom many other alumni, Greystone employees, and MPub faculty have highlighted as a VIP in the Vancouver publishing industry.

After hearing so many great things about Nancy Flight, I wanted to talk to her about the essential role she has played in so many people’s careers over her own 45-year career in publishing (around 24 of those years were spent at Greystone).

“I love doing what I can to help foster their skills,” she explains, adding that it is always exciting to meet others who are passionate about publishing and show aptitude in the industry, pointing out that the MPub program is ripe with talent. “And it’s wonderful to see people blossom, and see where they’ve gone with their careers.”

“It’s really important to me to encourage women that they can do both [have a successful career and a fulfilling home life],” Flight continues. “It’s wonderful to think that there are all of these young people who are more than ready to take on the challenges [in publishing].”

As a mentor, Flight notes the importance of mentees making their goals and interests known so that the mentor can tailor their advice to the individual relationship. However, she is quick to clarify that mentorships can be as formal or informal as you’d like—there is no one right way for the relationship to work.

Mentorships are a win-win for both parties: people like Wilensky receive great advice that helps them advance in their careers, while people like Flight are able to cultivate talent that they can later hire or recommend to another publisher.

Reflecting on the BC publishing industry, Flight confirms what we’ve been hearing from our many guest speakers all year: that the publishing community is very welcoming and friendly. “There is a feeling like we’re all in this together and we want to help each other.”

So don’t be shy, and reach out.


Reflections on the Emerging Leaders in Publishing Summit

We were halfway through the intensive Emerging Leaders in Publishing Summit before we realized that we, the Master of Publishing cohort, were the Emerging Leaders.

It was also around this time that our conversations with industry leaders, which took the form of keynote lectures, panel discussions, workshops, and one-on-one mentorship sessions, began to change. At the beginning of the week we talked data, marketing, trends, and growth. But as we began to talk diversity, inclusion, and responsibility, we discussed not just the problems in publishing, but what we can do to make a positive difference.

Discussions centred on how to create space for marginalized groups, the importance of mentorship and support, and ways in which we can make our industry more representative and balanced—both in terms of who works in the industry and what is published. These things matter so much.

“It was intense…it was daunting and overwhelming at times,” said MPub student Jesse Savage. “It was great to have everyone come out and hear everyone’s stories, and gain some perspectives and start conversations. I think after hearing everyone talk, I’m really interested and excited to see how things are going to change…it’s pretty clear that things have to change.”

Industry leaders from a variety of publishing backgrounds (including Simon & Schuster Canada, Penguin Random House Canada, Indigo Books and Music, Rakuten Kobo, Theytus Books, Orca Book Publishers, and a variety of smaller publishing houses), along with academics and authors, also noted the impact the week of listening, discussing, and learning had on them. The deeper conversations have inspired MPub students, external participants, and professionals alike to get back to their important work with a renewed sense of fidelity and responsibility.

As Digital Broadcaster Ryan McMahon said, “We’ve made this connection, and now we’re all going to continue to work together on this conversation, and that’s a really amazing offer by everyone who participated.” McMahon also gave a special public talk on the Wednesday evening, where he problematized Canada’s recent race to Indigenize everything, and challenged people to really think about how thoughtless actions and platitudes will only further harm Indigenous Peoples. He also talked about how we need to be aware of who is in spaces—and who is missing; why the conversation about colonization needs to happen before we talk Indigenization; and why building relationships needs to be at the centre of all we do if change is going to happen.

Much of what he and other guest faculty shared led to the MPub cohort looking at publishing with fresh eyes. We leave with the language to have these hard conversations, a better understanding of what needs to change, and ideas on how we personally can effect change. I hope that moving forward from this week we will continue to be unafraid to ask hard questions, push for better representation in the industry no matter our positions, and break down barriers within the publishing industry.

As promised, the week was one of transformative change and learning.

Faculty guests included: Dave Anderson (Rakuten Kobo), Kristin Cochrane (Penguin Random House), Gregory Younging (Theytus Books), Hazel Millar (Book*hug), Will Ferguson (award-winning author), Noah Genner (BookNet Canada), Kevin Hanson (Simon & Schuster Canada), Robyn Harding (bestselling author), Rania Husseini (Indigo), Jónína Kirton (Indigenous author), Ruth Linka (Orca Book Publishers), Janice Lynn Mather (Bahamian author), Nita Pronovost (Simon & Schuster Canada), and Felicia Quon (Simon & Schuster Canada).

Next year, Emerging Leaders in Publishing will be held February 4-8, 2019, and is open to everyone interested in learning more about the publishing industry in Canada.


How to arrange a professional placement

In addition to coursework and a final project report, the Master of Publishing Program also includes one four-month professional placement, which can be completed anywhere.

Students take the lead in arranging their own professional placement (with the support of the faculty and the industry), with the process beginning as the first semester of school comes to a close. In January and February students begin to finalize the details, and by April most students have their placements arranged. The placements typically run May to August (around 12 weeks). Students enter their placements at a higher level than traditional interns, and have more input in how the placement will work. For example, students are encouraged to brainstorm challenges in a particular area of publishing they are interested in and then present solution-based proposals.

Professional placements are arranged in consultation with the faculty in the Department of Publishing, who help students determine what their goals and aims are and then suggest professional placements that may be a good fit or industry professionals they should connect with.

So what steps do you take to find a placement?

  1. Determine your interests. What type of publishing are you drawn to? The list of areas to explore is very long—starting with book publishing in the first semester and ending with magazine publishing in the second semester. Be open to plans changing and to new ideas coming your way.
  2. Connect with guest lecturers. Introduce yourself to them after class, send them a thank you email or tweet, or invite them out for coffee. This is the time to grow your network and connect with many people who will support you throughout your career.
  3. Research different publishers. Check out their websites, go to their events, and become familiar with the types of books they publish.
  4. Set up informational interviews with publishers that pique your interest. An informational interview is very similar to a regular job interview, except you are the one asking the questions. Call or email publishers you are interested in doing your professional placement with, explain that you are interested in working for them, and ask if you could arrange an informational interview to get to know more about the company.

You can ask things like:

  • What kind of work do you usually have students do?
  • Are there any interesting projects going on that I would be able to be a part of?
  • What kind of instruction would I receive here?
  • How many students do you usually have at once?
  • What is the culture of the workplace like?
  • What do you like about working here? Is there anything you don’t like?
  • What are you able to offer in terms of compensation?
  • Are there opportunities for employment following my placement?
  • Is there anything else you think is important for me to know?

Make sure to follow up the interview with a personalized thank you email or card.

  5. Watch the Quill & Quire job board and follow SFU Publishing on Twitter and Facebook for professional placement postings. Some placements are competitive and you will need to apply for them as you would a regular job. Other placements are arranged more casually, but you will still need to send your host publisher your resume to have on file.
  6. Update your resume and cover letter. SFU has Career Education Specialists available at each campus to help one-on-one with resume and cover letter writing, mock interviews, networking strategies, and more.

Remember that it is going to be okay. Everyone finds a placement, and the faculty are here to support you throughout the process.


Playful Generative Art: Computer-Mediated Creativity and Ephemeral Expressions

WEDNESDAY, February 8, 2017
7:00 pm – 8:30 pm
Room 1800 (SFU Harbour Centre)
Fee: Free (to reserve a seat, please email pubworks@sfu.ca)

“Generative art” is a blanket term for any creative work produced in part through programmatic or algorithmic means. “Playful generative art” makes use of highly technical disciplines—computer programming, statistics, graphic design, and artificial intelligence—to produce chat bots, digital poetry, visual art, and even computer-generated “novels.” These pieces may be motivated by serious social or political issues, but the expressions are decidedly unserious, often short-lived or quickly composed. Creators working in this medium are rarely artists first—as programmers, designers, game developers, and linguists, they use the tools of their trade in unexpected and delightful ways. Generative art also has much to teach us about issues at the intersection of ethics and technology: what is the role of the artist in a human/machine collaboration; what is our responsibility when we design programs that talk with real people; how do we curate and study ephemeral digital works? Digital artists, writers, technologists, and anyone interested in media studies are invited to attend.

Guest Speaker:


Liza Daly is a software engineer and occasional corporate executive who lives in Boston. She is currently focusing on providing technical assistance to non-profits that work to uphold civil rights and protect vulnerable populations. Her personal projects revolve around digital art, interactive narrative, and digital publishing. Formerly she was CTO at Safari and, prior to that, founded a digital publishing company called Threepress, which Safari acquired. Her new company is World Writable. She has been quoted about “Digital Detox” and the effects of the iPad on reading (NYT, 2010), ebooks in the cloud (Wired, 2011), and on strategies to help introverts network (FastCompany, 2015). Liza has presented about great engineering teams and digital publishing. She wrote a short book on Next-Generation Web Frameworks in Python (O’Reilly, 2007), which, she says, is “out of date so please don’t read it”.


Trying not to drop breadcrumbs in Amazon’s store

"Your margin is my opportunity" ~ Amazon CEO Jeff Bezos

Last week while on a day trip to Seattle, I decided to make a stop at Amazon’s first[1] “Brick and Mortar” store, in the University of Washington neighbourhood of University Village.

I had two goals in mind. I wanted to see if I could get a personal book recommendation from an employee and I wanted to purchase a book without leaving any data. I am not averse to providing personal information, but I like to have the choice.

David Ryder—© 2015 Bloomberg Finance LP

Now the company was moving into the physical space. How consumer-centric would it be?

As I wandered into Amazon’s first physical book store, I was struck by nostalgia for the small independent bookstores. Amazon has come full circle. The small independent bookstores were decimated by massive, multi-city block stores such as Chapters in Canada and Barnes and Noble in the US. In turn, Amazon’s entry into the online bookselling business in the late 1990s had, by the early 2000s, badly bruised the competition. Books began to take a back seat to lifestyle merchandise, kitschy cards, stationery, and stuffies. Amazon grew bigger, squeezed publishers on pricing, and further backed the physical bookstores into a corner. Books were no longer realizing a sustainable margin for the big chains. By 2010, Amazon dominated bookselling through online sales of print books and was leading the way in ebook sales with its proprietary electronic reader, the Kindle.

Photograph: Suzanne Norman

The Seattle storefront reminded me of a modest neighbourhood book store in a great community setting. There was a cupcake shop and a kids’ toy store nearby, and kids were playing on a playground outside. People were coming out of Amazon chatting and smiling, heading off to get a coffee and talk books.

An employee greeted us inside the store. She held an electronic gadget in her hand and smiled warmly as she beckoned toward the inside of the 7,400-square-foot store. In the very centre of the room sat a huge flat-screen TV. I am not sure if it was 4K, but the three kids sitting in front of it sure looked happy. They were playing the iOS-born Crossy Road using Amazon Fire TV.


Next to the TV was a table bedazzled with Kindles of all makes and models. A salesperson was talking to a customer about Amazon’s digital assistant Alexa (allegedly named in tribute to the Library of Alexandria). “Alexa” is actually the wake-up word used to activate Amazon’s Echo[2], a voice-command device for the “smart home” that answers questions, reads audiobooks, orders pizza, and gets better at offering suggestions and choices the more data “she” has to analyse.

The more questions you ask of Alexa and the more you interact with it, the more it can “help make your life easier”, assures the salesperson. The customer is clearly unsure as to how all this works and suggests that it might make too much personal information freely available.

The salesperson quietly tells him that the information collected by Alexa is used to enhance the customer experience; to make shopping easier. Developed for the voice-activated “smart home ecosystem”, Alexa also personalizes search results for pretty much anything, from books to vacations, and, of course, helps the user order or restock items through Amazon.com.

Not having access to Alexa in Canada, I was completely enthralled, but I was getting data-saturated.

I refocused on books and decided to ask the nice employee who greeted us at the door if she might be able to help me find a book. She whipped out her device and asked me what title I wanted. I said I was looking for a recommendation of some titles. I told her I wanted to buy something for a friend who was interested in sports writing, more specifically, newspaper sports writers’ work outside of their journalism work.

She frowned and said she was sorry: she could only search by title, and she did “not know all the books in the store.” She suggested I try the Sports, Entertainment, Biography and Reference sections. She assured me that all the books in the store were 4 or 4.5 stars. I asked what that referred to and she said all the books in the store were curated according to the reviews at Amazon.com.

Photograph: Suzanne Norman


I walked to the Sports section and browsed some titles. I found an anthology of sports writing. It looked good. I checked the price on the back – $14.95. Knowing that Amazon would use the online price, I walked back to the employee and asked what the barcode on the cards attached to the shelf was for, pretty certain it was a way to get the online price. She told me it was for internal inventory control.

On my way back to the shelf I ran into another employee. I showed him my book and asked if I could find the online price. He led me to a price checker, and there I discovered the book was listed at $11.99. Perfect, I thought. I did not have to give any personal data and I would get a nice discount. As I turned to leave, the employee suggested I also download the Amazon app. All I had to do was tap the tiny camera icon in the app and I could scan the barcode on the shelf to get the lowest price on any book.

I said I was told the barcode was for internal inventory control. He looked baffled and told me they use the code to give customers access to the most up-to-date online discounts. He noted that the online prices were always fluctuating “for various reasons” and that instead of changing the physical cards on the shelf, they just update them electronically. Makes sense.

I downloaded the app and headed to the children’s section. I already knew that just by using Amazon’s WiFi on my phone my data footprint was becoming visible, but I was not planning to log in to the app, just use it to check prices.

I scanned one children’s book and got the price in Canadian dollars – more than the American cover price. Not good, but not too surprising given the current exchange rate.

I scanned another title. This time, I was instructed to log in. I decided I would just go buy my books; I did not HAVE to log in anyway, since I could always use the price checker. But if I did log in, would I get a better price based on my own shopping habits? Would I get a “personalized” price? I did not see how prices could be that customized (yet), since the salesperson had been scanning the book itself. And as I did not log in to the app, I have no idea what steps I would have been taken through on my phone.

Ha, I thought. Minimal data breadcrumbs, two books, and an enjoyable shopping experience (for the most part; I was still wondering about the internal inventory control comment). And I had US cash, so I didn’t have to worry about the weak Canadian dollar in the purchase.

“You have saved 35%!” the beautifully dressed salesperson told me. Great, I thought, and handed her my two American twenty-dollar bills. Her brow wrinkled and she said, “I am sorry. We are a cashless store.”

What?? I had no US credit card on me, and using my Canadian Visa was going to wipe out my discount and probably even add to the list price. I asked why in the world Amazon didn’t take cash.

She smiled and said, “We want to replicate the online experience as much as possible.”

“But, you’re a physical store with physical books and I have physical money.”

She was sorry, she said.

I handed her my credit card. I could almost feel my data downloading into the Amazon vortex as my card slid through the machine. So much for avoiding the data trap…

“Have a wonderful day!” I heard from behind me as I headed out the door and thought about the rumoured expansion of Amazon’s physical stores into everything, not just books…

Photograph: Suzanne Norman


[1] The location opened in the fall of 2015, with plans for a second store in San Diego and, according to some industry experts, a rollout of hundreds more.

[2] (Just last week Amazon released two new “siblings” for Alexa: the Echo Dot and Tap).



Website Usability, or How to Communicate in Less than 7 Seconds

By Lauren Cheal

Design a usable website. This is undoubtedly a lofty goal, but one that is increasingly crucial to business success in publishing. Web usability is really a form of mind reading. First ask what users want and need to do on a website, and then use those answers to design content that presents information in a way that guides users to the appropriate end goals. The International Organization for Standardization (1998) defines usability under the following metrics:

Efficiency: the level of resource consumed in performing tasks

Effectiveness: the ability of users to complete tasks using the technology and the quality of output of those tasks

Satisfaction: users’ subjective satisfaction with using the technology

These focus areas provide a simple place to start when evaluating any website. If a business goal of a certain company is to have a visitor sign up for an email newsletter, the web design must address the process the user undertakes to do this.

1. Is it efficient: does it exclude unnecessary steps like entering a phone number or other irrelevant information?

2. Is it effective: once they complete the online form, have they actually been signed up for a newsletter they want to receive?

3. Are they satisfied: does the user feel like they accomplished a task?

Asking these questions about efficiency, effectiveness, and satisfaction of experiences is the first approach to usability. Let’s take a look at each of these factors in more detail.
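
Before digging in, here is a rough sketch, in Python, of how these three checks might be summarized from user-testing data for the newsletter sign-up task; all field names and numbers below are invented for illustration.

```python
# A rough illustration (field names and numbers invented) of summarizing the three
# ISO usability metrics from user-testing sessions for the newsletter sign-up task.
sessions = [
    {"completed": True,  "seconds": 42, "satisfaction": 4},  # satisfaction: 1-5 survey score
    {"completed": True,  "seconds": 75, "satisfaction": 3},
    {"completed": False, "seconds": 90, "satisfaction": 2},
]

effectiveness = sum(s["completed"] for s in sessions) / len(sessions)    # task success rate
efficiency = sum(s["seconds"] for s in sessions) / len(sessions)         # mean time on task
satisfaction = sum(s["satisfaction"] for s in sessions) / len(sessions)  # mean survey score

print(f"Effectiveness: {effectiveness:.0%}")
print(f"Efficiency: {efficiency:.0f} seconds on average")
print(f"Satisfaction: {satisfaction:.1f} / 5")
```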

Efficiency: Better Make it Quick

Soothsayers and divining rods were once used to understand the world and human behaviour, but thankfully, modern science provides other more reliable solutions. Neuroscientists have come up with different ways of actually reading the human mind. The most common mind-reading device in the field of web usability research is eye-tracking, which involves a camera following the eye as it moves around a display (science 1, soothsaying 0). Sirjana Dahal (2011) measured first impressions of websites using one of these eye-tracking programs, and reported the following conclusions. The first is that users spent less time on websites deemed “unfavorable” (Table 4.8). Perhaps this is not a shocking revelation, but it underlines an important principle of web design and usability. People know what they are looking for, and if a website does not offer it, they will go elsewhere (and quickly, in no more than the time it takes to hit the back button). The conclusion that users spend less time on “unfavorable” sites also reinforces the importance of connecting people with the content they are looking for. This idea will be further explored in the following section on effectiveness in web design.

Dahal also looked at the ways users prefer to view web content, and these preferences are broken down by design categories. The following table outlines those conclusions.

User Preferences for Visual Style of Websites

The information in the above table is in keeping with basic principles of design that apply to print materials like magazines and newspapers. Where those print technologies have traditionally had barriers to access that require a relatively sophisticated knowledge of print production to make a viable product, the online world is a democratized, open-source environment that encourages access for all. This generalization is certainly debatable, but at the same time, how usable a website is can be directly correlated with how much attention is paid to these principles of design.

Another resource from Dahal (2011) is an assessment of how much time users fixated on different areas of a simple website during the visit. That information is summarized in the table below.

Top Areas of Interest in Website Viewing

There are two things worth paying particular attention to in this information. The first is just how little time is spent on any one element of the webpage. 6.48 seconds is the most time a website can expect to hold the attention of an average visitor. 6.48 seconds. Given this minuscule window, it is crucial that websites are built with absolute efficiency in mind.

Effectiveness: Help Me Help You

Steve Krug offers a very succinct guide to best practices for web design in his 2006 book, Don’t Make Me Think: A Common Sense Approach to Web Usability (2nd Edition). The underlying argument Krug makes is that users will not do things on a website that take extra mental effort. Krug offers that users like obvious, mindless choices. A big part of what makes some choices more obvious than others is how they are labeled, and how the navigation of the site is laid out. Krug (2006) argues that the lack of physicality on the internet makes a webpage’s navigation system absolutely crucial to a user’s experience.

Website navigation should:

Help us find whatever it is we’re looking for

Tell us where we are

Give us something to hold on to

Tell us what is here

Tell us how to use the site

Give us confidence in the people who build the site

(adapted from Krug, 2006, p. 59-60)

With so many important tasks placed on the shoulders of navigation, a great amount of attention should be paid to how the elements of navigation (menus, sections, and utilities as a start) are designed and communicated. The application of conventions that communicate physical space and direct user actions is a major factor in how effective a website is from a usability standpoint.

Krug’s model also suggests that users scan websites instead of reading them. He compares webpages to the billboards we pass on the highway at 100 km per hour. If the information on the site can’t be read at that high speed, it is not an effective communication tool. One way to achieve quick and effective readability is to reduce the number of words on the page to focus user attention on exactly what you want them to do.

Krug describes how users interact with instructions on webpages: “The main thing you need to know about instructions is that no one is going to read them—at least not until after repeated attempts at ‘muddling through’ have failed. And even then, if the instructions are wordy, the odds of users finding the information they need is pretty low” (Krug, 2006, 42). Anyone who has tried to sift through an online help or FAQ page (here is an example of a wordy instruction page from the SFU Library) knows that this is absolutely true. It is a lightning-fast scan of the material, a quick attempt to click around and see if you can intuit your way out of your particular issue, and then a jump back to the help page for another nugget of information to try. Krug’s emphasis on the speed in which users can access the information they need mirrors the findings of Dahal, and many other usability experts and researchers. Milliseconds will dictate whether or not a person is going to use a website to do a task.

Satisfaction: Ahh. That’s the Stuff

The subtitle of Seth Godin’s 1999 book Permission Marketing: Turning Strangers Into Friends And Friends Into Customers has become almost cliche in the internet marketing canon. The principles laid out in Godin’s book still hold, and point to a fundamental shift in marketing that came about because of how the internet changed how we talk to each other. Godin argues that in order to make a sale online, a company must ask permission using accepted web practices. If a business is serious about making an impact on their bottom line through a website (this impact is not restricted to sales of goods, and should be thought of as any way that an online presence can enhance customer experience), serious attention to design and web usability is a good place to start. Providing a satisfying customer experience is about more than just giving them the product they want. Now, more than ever, it is about getting people involved in a community (that lives online, primarily), asking them to participate in the community, and having them help build a brand reputation on behalf of the business. This ability to engage in and with a community should absolutely be considered when designing a usable website.

Research into the factors that contribute to user satisfaction on websites helps point the way toward what a business should do to keep their customers. Kincl and Strach (2012) studied user satisfaction on 44 different educational institutions’ information-based websites by documenting satisfaction levels before and after the use of the websites. The researchers found that content and navigation were key areas in determining overall satisfaction, and that “users perceive high-quality websites if they achieve what they visited the site for. This success in user activities is subconsciously reflected in website assessment” (Kincl and Strach, 2012, p. 654). In short, people are satisfied when the website they visit does what they expect it to do. A simple sentiment that is anything but simple to implement. Another interesting finding from this study is the fact that users care less about what the researchers term “trivial” data like the colour of the site than “non-trivial” data like the content (Kincl and Strach, 2012). That is to say, an average user would still rate their satisfaction of an unpleasantly-coloured site highly if they found the information they needed. This serves as a reminder that while attention to the look of a website is certainly important, in the end, users want substantive content (and to be able to find it).

The 3 Most Important Things to Remember about Usability

If you were scanning this article like a billboard, this is where your eyes should stop scanning and start reading.

1. Your website needs to communicate really, really quickly. In under 7 seconds.

2. Your website needs to be easy to use. It should be obvious where a user should focus, and then what action they should take at each step (and there shouldn’t be many steps).

3. Your website needs to give a user exactly what they think they need. A website is a promise, and it is up to you to define that promise and then to deliver on it.


Bibliography

“Website Usability.” 2008. American Libraries 39 (10): 32.

Dahal, Sirjana. 2011. “Eyes Don’t Lie: Understanding Users’ First Impressions on Website Design using Eye Tracking.” Master of Science, Missouri University of Science and Technology.

Garrett, Sandra K., Diana B. Horn, and Barrett S. Caldwell. 2004. “Modeling User Satisfaction, Frustration, and User Goal Website Compatibility.” Human Factors and Ergonomics Society Annual Meeting Proceedings 48 (13): 1508-1508.

Godin, Seth. 1999. Permission Marketing: Turning Strangers into Friends, and Friends into Customers. New York: Simon & Schuster.

Green, D. T. and J. M. Pearson. 2011. “Integrating Website Usability with the Electronic Commerce Acceptance Model.” Behaviour & Information Technology 30 (2): 181-199. doi:10.1080/01449291003793785.

International Organization for Standardization (ISO). 1998. Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs), Part 11: Guidance on Usability. Geneva, Switzerland.

Kincl, Tomas and Pavel Strach. 2012. “Measuring Website Quality: Asymmetric Effect of User Satisfaction.” Behaviour & Information Technology 31 (7): 647-657. doi:10.1080/0144929X.2010.526150.

Krug, Steve. 2006. Don’t make Me Think: A Common Sense Approach to Web Usability. Berkeley, Calif: New Riders.

Mathews, Brian. 2009. “Web Design Matters.” Library Journal 134 (3): 24-25.

Morris, Terry A. 2012. Basics of Web Design: HTML, XHTML & CSS3. Boston: Addison-Wesley.

Snider, Jean and Florence Martin. 2012. “Evaluating Web Usability.” Performance Improvement 51 (3): 30-40. doi:10.1002/pfi.21252.

Wu, Somaly Kim, and Donna Lanclos. 2011. “Re-Imagining the Users’ Experience.” Reference Services Review 39 (3): 369-389. doi:10.1108/00907321111161386.

Other things to consider and discuss:

There are many institutions that attempt to categorize the “Top Websites” in the world at any given time, and Alexa, the web-traffic ranking service, is one of them. In addition to statistics on the most visited pages, Alexa provides information on category-specific website usage. For 2012, the top websites under the category “Publishing” were:

EzineArticles.com

Merriam-Webster

ArticlesBase

Audible.com

John Wiley and Sons, Inc.

Springer

GoArticles.com

Wiley Online Library

McGraw-Hill

OverDrive

See the full list here.

This top ten list shows varying degrees of attention to design and usability. The standouts are the two Wiley websites, which both have a clean look and a clear path for users to follow, and the Audible website, which works well to present a product, give key information about the product, and direct the user to an action (to “Get Started” using the product). Audible is a subsidiary of another company that is very good at directing user flows in a publishing environment (Amazon, of course).

20 Top Web Design and Development Trends for 2013 – By Craig Grannell of netmagazine

AWWWARDS – Recognition & Prestige for Web Designers (a good place to look for inspiration and ideas).


Demystifying Algorithms: Understanding the Math that Defines Your Life

By Lauren Cheal

Keeping in mind that we are publishers, and not math people, what is an algorithm?

At first glance, the algorithm sounds like a concept out of a particularly frightening chapter of a calculus textbook, but there is no reason to fear the concept. In Kevin Slavin’s TED Talk, “How Algorithms Shape Our World”, he defines algorithms as “basically, the math that computers use to decide stuff”. This simple definition is an easy way to think about the algorithm, but what “stuff” are computers using to make the decisions?

Wikipedia summarizes algorithms as the following:

An algorithm is an effective method expressed as a finite list of well-defined instructions for calculating a function. Starting from an initial state and initial input (perhaps empty), the instructions describe a computation that, when executed, will proceed through a finite number of well-defined successive states, eventually producing “output” and terminating at a final ending state.

To make this a little easier to understand, think of an algorithm as a program that is capable of going through a huge pile of information and making sense of it. The logical output that comes from this process is defined by the user at the beginning, according to the things they need it to do, and the order and way in which it is asked to do those things.

Jeff Hunter (2011) provides this helpful list of what commonly used algorithms do (each is sketched briefly after the list):

  • Searching for a particular data item (or record).
  • Sorting the data. There are many ways to sort data. (Simple sorting, Advanced sorting)
  • Iterating through all the items in a data structure. (Visiting each item in turn so as to display it or perform some other action on these items)
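
As a minimal sketch of those three operations, the snippet below uses plain Python and an invented set of book records; it is meant to illustrate the ideas, not any particular library.

```python
# A minimal sketch of the three operations above, using plain Python and an
# invented set of book records.
records = [
    {"title": "Book C", "weight_kg": 1.2},
    {"title": "Book A", "weight_kg": 0.4},
    {"title": "Book B", "weight_kg": 0.9},
]

# 1. Searching for a particular record.
match = next((r for r in records if r["title"] == "Book B"), None)

# 2. Sorting the data (lightest to heaviest).
by_weight = sorted(records, key=lambda r: r["weight_kg"])

# 3. Iterating through every item to perform an action on it.
for r in by_weight:
    print(f'{r["title"]}: {r["weight_kg"]} kg')

print("Found:", match)
```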

Once you understand the basic idea that an algorithm is a function that allows for categorization of large amounts of information, it is easy to see where an algorithm has value in the digital world. Computer information is no more or less than a big pile of data, and algorithms give us shortcuts to processing these enormous data sets.

A basic type of algorithm is a sorting algorithm, which can take a set of data (let’s say, 100 books of different sizes) and sort those items in a specific way (according to their weight, for example). The Computer Science Unplugged website for children gives a good example of how a couple of different algorithms could tackle the same weight-sorting task, each with its own time and space trade-offs. There are many other types of algorithms in daily use, such as the search algorithms that produce Google results, and each, in its own way, displays results or classifications based on how the algorithm works with the data.
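
To make the weight-sorting example concrete, here is a hedged sketch of two different algorithms solving the same task: a hand-rolled selection sort, which is simple but needs roughly n² comparisons, and Python’s built-in sorted(), which uses the faster Timsort. The book weights are invented.

```python
# A sketch of two different algorithms solving the same weight-sorting task; the
# weights are invented. Selection sort is easy to follow but takes O(n^2) comparisons,
# while Python's built-in sorted() uses Timsort, which needs only O(n log n) on average:
# the kind of time/space trade-off those exercises illustrate.
def selection_sort(weights):
    items = list(weights)  # copy, so the original list is left untouched
    for i in range(len(items)):
        # Find the lightest remaining book and swap it into position i.
        lightest = min(range(i, len(items)), key=items.__getitem__)
        items[i], items[lightest] = items[lightest], items[i]
    return items

book_weights = [1.2, 0.4, 0.9, 2.1, 0.7]
assert selection_sort(book_weights) == sorted(book_weights)
print(selection_sort(book_weights))  # [0.4, 0.7, 0.9, 1.2, 2.1]
```
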
Ok, so besides basic sorting tasks, what can algorithms really do?

Searching, Prioritizing, and Providing Biased Content:

Very complex algorithms can accomplish an endlessly diverse number of tasks. The Google search function is one example that we are familiar with, and now that you know what a basic algorithm does, this makes sense. Google search is sorting through an enormous amount of information, and using various functions to come up with the “best” end product—the exact website you were looking for. Seomoz.org has an entire section devoted to understanding the Google search algorithm, including an interesting timeline that tracks all of the changes to that algorithm beginning in 2000 and cataloguing each year’s changes up to the present. While some of the language on that site is technically advanced (and intimidating), it is interesting to see the huge number of changes every year (Seomoz estimates that Google changes its algorithm up to 500-600 times a year), and to guess at what those changes mean for how we receive content.

Another implication of algorithm technology that is related to Google searching is found on Facebook. There, algorithms determine what content appears in a News Feed based on what content the user has previously “liked”. Eli Pariser, president of MoveOn.org, talks about the danger of this kind of filtering in another TED Talk that is also embedded below. Pariser argues that people have to be careful about letting algorithms decide what news they see based on their likes, because a healthy news diet consists of both the things they instinctively like (chosen with the gut) and the things that could enrich an understanding of the world by pushing people to discover things outside of their current sphere of knowledge. Pariser goes further to say that algorithms are taking the place of traditional news editors (who were human, of course). Where the human editor acts as a gatekeeper and guide to information based on what they know about the audience and what they think the audience needs to know about, the (current, as of his talk) algorithm is only making judgments based on the most superficial, instinctual, and hedonistic of our online habits.
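
As a toy illustration of the filtering Pariser describes (emphatically not Facebook’s actual News Feed algorithm), consider ranking stories purely by how much they overlap with topics a user has already “liked”; everything in the sketch below is invented.

```python
# A toy illustration of "like"-driven filtering; this is emphatically not Facebook's
# actual News Feed algorithm, and the topics and stories are invented.
liked_topics = {"cats", "baking", "travel"}

stories = [
    {"headline": "Ten new cat cafés to visit", "topics": {"cats", "travel"}},
    {"headline": "Election coverage tonight",  "topics": {"politics"}},
    {"headline": "Sourdough starter basics",   "topics": {"baking"}},
]

def like_score(story):
    # Score a story by how many of its topics overlap with what the user already likes.
    return len(story["topics"] & liked_topics)

feed = sorted(stories, key=like_score, reverse=True)
for story in feed:
    print(like_score(story), story["headline"])
# The politics story scores 0 and sinks to the bottom: a "filter bubble" in miniature.
```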

Assessing physical space, stealing jobs from humans:

Algorithms are also used by robotic machines for measuring and assessing physical space. The Roomba vacuum cleaner is able to clean a room because of an algorithm that works out the dimensions of the room and then sends the vacuum to each part of the room, systematically. The 60 Minutes feature that is embedded at the end of this paper gives another example of this kind of algorithm. There, robots are programmed to pick up warehouse shelves and bring them to workers at the moment they need to access the materials to pack them. These two examples show that algorithms are capable of computing physical space, and then making an assessment of a complex set of data to make a certain “choice” about a desired outcome.

Algorithms in conflict: crashes, glitches. What happens when this stuff breaks down?

The most frightening thing about algorithms concerns their volatility, and their ability to “speak” to one another and inform the decisions that another algorithm makes. In “How Algorithms Shape Our World”, Kevin Slavin talks about the potential harm that can be done when these algorithms work outside of human control when he references the Crash of 2:45, or the “Flash Crash” that happened on the U.S. stock market on May 6, 2010. The “black box trading” that algorithms execute, in conjunction with “High Frequency Trading”, contributed to the second largest point swing, and the largest point decline, on the Dow Jones Industrial Average in history (Lauricella and McKay, 2010). On how and why this happened, Slavin (2011) explains that in these algorithms, “We’re writing things that we can no longer read. We’ve rendered something illegible”. The term “black box trading” highlights the fact that some of the code works behind a wall of understanding that even the people who wrote the original formula no longer have access to.

When it comes to algorithms that build on each other and change outside of human oversight, the cause for concern is slightly larger. There are many art forms that talk about this potential problem. A lot of science fiction literature and pop culture deals with this fear we have about creating things that outpace humanity and then turn on it. Think of Isaac Asimov’s I, Robot, Cory Doctorow’s Down and Out in the Magic Kingdom, The Matrix, and Battlestar Galactica. The fact that these messages pervade popular culture (and have done so for hundreds of years) speaks to an understanding that humans might just be too smart for their own good. And while algorithms represent a truly fascinating and powerful tool, extreme caution is wise when implementing automatic systems, particularly when those systems control finances, social lives, access to information, or any other frightening prospect.

A basic understanding of what an algorithm does, of what it has the potential to do, and of who is controlling the technologies that rely on these equations to make decisions about our lives gives us the power to ask critical questions and make sure that humans are in control of the technologies created.

At least, that is the hope.


There are a number of fun web videos on the topic of Algorithms. Please enjoy those below, and feel free to share others you know about in the comments section here.

The TED Talk that started it all: How Algorithms Shape our World by Kevin Slavin.

http://www.ted.com/talks/kevin_slavin_how_algorithms_shape_our_world.html

60 Minutes Feature: “Are Robots Hurting Job Growth?”. Alarmist title aside, this is a video that shows some very cool uses of algorithms in the manufacturing and production sectors in the United States.

Eli Pariser’s TED Talk: Beware Online “Filter Bubbles”

Another TED Talk, this time about the algorithmic editing of the web and how that editing function affects the content we see online, and thus the reality of the internet we experience. Because these algorithms are now our editors, Pariser argues that we need to make the algorithms focus on a balanced news diet, including some junk food and some vegetables.

http://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles.html

At PopTech 2012 Jer Thorp gave a presentation on Big Data. This is a visually gorgeous look at different types of data being displayed in very interesting ways.

Thorp looks at how the data trails we leave (think of yourself as a data slug that leaves behind an electronic trace of everything you do) can be examined, visualized, and ultimately understood. He also breaks down the “architecture of discussion” by mapping Twitter conversations that happen around a New York Times article.

He also warns that data is the new oil, and that the fragmented microorganisms that compose oil are not dissimilar to the fragmented pieces of our souls that make up public data.

http://poptech.org/blog/will_art_help_us_harness_big_data_better_than_we_handled_big_oil

And finally, from The Onion: Are We Giving The Robots That Run Our Society Too Much Power? This is just one of my favourite robot-related videos of all time. My apologies, it doesn’t have embedding code, but it is worth clicking the link.

http://www.theonion.com/video/in-the-know-are-we-giving-the-robots-that-run-our,14200/


Sources Consulted

Danielsson, Lorenzo E. “Understanding Algorithms for Novice Programmers,” Keeping it Small and Simple. May 20, 2007.

http://lorenzod8n.wordpress.com/2007/05/20/understanding-algorithms-for-novice-programmers/ Retrieved January 13, 2013.

“Google Algorithm Change History”. http://www.seomoz.org/google-algorithm-change Retrieved January 17, 2013.

Green, Scott A., Mark Billinghurst, XiaoQi Chen, and G. J. Chase. “Human-robot collaboration: A literature review and augmented reality approach in design.” (2008).

Hunter, Jeff. “Introduction to Data Structures and Algorithms.” December 28, 2011. http://www.idevelopment.info/data/Programming/data_structures/overview/Data_Structures_Algorithms_Introduction.shtml Retrieved January 13, 2013.

Lauricella, Tom, and McKay, Peter A. “Dow Takes a Harrowing 1,010.14-Point Trip,” Online Wall Street Journal, May 7, 2010. Retrieved January 15, 2013.

Pariser, Eli. The Filter Bubble: What the Internet Is Hiding from You, Penguin Press (New York). (2011).

Shirky, Clay. A Speculative Post on the Idea of Algorithmic Authority. November 15, 2009. http://www.shirky.com/weblog/2009/11/a-speculative-post-on-the-idea-of-algorithmic-authority/ Retrieved January 16, 2013.

Tan, Pan-Ning, Steinbach, Michael, and Kumar, Vipin. Introduction to Data Mining, Addison-Wesley (2006).