When I was originally assigned the task of comparing the presentation programs PowerPoint and Prezi to determine which was most suitable for technical writing students at Lane Community College, I was given three criteria, of which two were to be selected as focus points for evaluation. The available criteria were "Ease of Use," "Effectiveness for Audience Comprehension," and "Creative Options." This initial choice would prove to be its own small test in disguise.

From my perspective as a technical writing student, and from my experience working in businesses, I felt that "Ease of Use" was the primary category to focus on: I don't have time to waste, so any time saved in the learning process can be redirected to more important things, such as building and practicing the presentation. Another litmus test I used to re-contextualize this choice was simple: I asked myself which criterion I would go with if I could only choose one, and then simulated the outcomes of the other choices. Focusing on creative options might benefit advanced users or artists, but how could those options matter to a pressured person who first needs to learn how to use the software? Furthering the thought experiment, I asked myself, "Would effectiveness for audience comprehension be the best solo criterion?" The same prioritization from the first case applies here as well: does it matter whether the audience receives the content more easily if you can't build a presentation in the first place because the learning process is too time-consuming? Working from this functional framework, it seemed clear that ease of use had to be the primary factor.

The question thereafter was which of the two remaining criteria would best support the primary. This choice was a lot closer to call. On one hand, effectiveness for audience comprehension seemed like the best choice, since to be successful in the future, the presentations given by technical writing students should be comprehensible to the receiver. Fair point. The reason I selected creative options over audience comprehension instead comes down to sequencing. Using the same functional framework, I reasoned that the audience's reception depends both on the presenter's overall skill and on the presenter having a wide range of options with which to craft the presentation. In essence, I see the criteria feeding and building off each other in this order: ease of use unlocks the ability to use the software at all, creative options allow that skill to be transformed into high-quality and/or advanced presentations, and audience comprehension allows the finished presentation to deliver its content meaningfully. Each is needed for a presentation to exist and be delivered successfully.
To locate sources I performed two dedicated research phases, casting a wide net using two search engines, the database at PLOS One, and Google Scholar. These phases were not meant to evaluate sources, only to amass information for review. To put this into perspective, my original batch of sources consisted of about 30 studies and articles; by the time I finalized my report, that number had been cut down to twelve. The goal was to collect anything falling into three targeted information categories: sources comparing PowerPoint to Prezi, sources covering the features of PowerPoint, Prezi, or both, and sources on what college students want from their presentation software. After these gathering phases, a filtration phase targeted the most relevant data, which I would use to make a recommendation.
The filtration of data used a two-level priority system. Information was first filtered by category: scientific studies and academic work were tier one, large data sets and expansive works citing tier-one sources were tier two, and tier three consisted of everything else. Each tier was then prioritized by recency, as sketched below. This organization shaped how the gathered data was valued and allowed me to quickly discount low-quality or irrelevant sources, leaving behind the potentially useful ones. Most of the cuts were made because the material was too dated or too commercialized, but a couple were made over quality issues. One cut, for example, was an article from Udemy.com that compared PowerPoint to Prezi in good detail. The article was well written and did a decent job of explaining what each program does, but it stressed that it didn't matter which you chose, since presenter skill is the determining factor in how well a presentation is received. While I agree with that notion personally, Udemy is a business that generates its income by selling lessons to users, so there is a profit incentive for Udemy to discuss the two programs without being too conclusive while nudging users toward purchasing a lesson for whichever one appeals to them. Since concluding my inquiry I have reread the Udemy article's findings and found them generic and dated compared to the sources that ended up in the report.
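As a rough illustration of that two-level filter, here is a minimal sketch in Python. The source entries, tier assignments, and years (other than the 2021 study named later) are hypothetical placeholders, not my actual working list.

```python
# Minimal sketch of the two-level filter described above: rank sources
# first by quality tier, then by recency within each tier.
from dataclasses import dataclass

@dataclass
class Source:
    title: str
    tier: int   # 1 = studies/academic, 2 = large datasets citing tier one, 3 = everything else
    year: int

# Hypothetical example entries (years for the last two are placeholders).
sources = [
    Source("Prezi v. PowerPoint: finding the right tool for the job", 1, 2021),
    Source("G2.com user-review dataset", 2, 2022),
    Source("Udemy.com comparison article", 3, 2018),
]

# Lower tier number first; within a tier, newest first.
ranked = sorted(sources, key=lambda s: (s.tier, -s.year))
for s in ranked:
    print(f"Tier {s.tier} ({s.year}): {s.title}")
```

Sorting on the (tier, -year) key keeps the quality ranking primary and only breaks ties by recency, which mirrors how the manual review was ordered.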
After gathering and organizing the information, it became clear that a couple of serious issues had risen to the surface, and they would need to be addressed before I could make a true recommendation. On a positive note, I had also found a well-executed current study and a potential databank to draw information from. The first issue was that there seemed to be a lack of current, serious studies on presentation software. The second issue was with my choice of criteria. As I looked through articles and studies covering the features of each program, a pattern emerged: an article would tout one program over the other based on a particular functionality or feature, only to be nullified by a more recent article describing how the features had since been updated. This was the more serious issue, as one of my criteria, "Creative Options," was going to be a detailed feature comparison between the two contenders, and that factor kept constantly evolving! This was no good for making a determination! BLAST!
For the first issue, the short supply of current information, I decided that the most recent and well-performed study would serve as the primary source to draw from, with additional context coming from older works. That primary source is a 2021 study published by the State University of New York Institute of Technology, "Prezi v. PowerPoint: finding the right tool for the job."
Additionally, a software review website I discovered, G2.com, had thousands of confirmed user reviews (growing by the day) for PowerPoint and Prezi, broken down into nine categories related to ease of use, along with many specific feature ratings and data on the sizes of the organizations the reviews came from. This pool of user reviews was far larger than any study's test group, so I placed the "Ease of Use" ratings for each program into Excel sheets to build graphs for analysis. While G2.com is a software retailer, the reviews are user generated, and you must sign up and provide detailed information to confirm your G2.com account, so I felt the commercial aspect of the site was disconnected from the data and unlikely to compromise it. More than anything else, this was the largest set of data on PowerPoint or Prezi that I could locate, and that made it unique, exciting, and valuable to the comparison in my eyes.
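For anyone who would rather script that step than build the charts by hand in Excel, here is a minimal sketch of the same idea in Python. The CSV file names and the "rating" column are hypothetical stand-ins for however you export the G2.com review data.

```python
# Minimal sketch: average the "Ease of Use" ratings for each program and
# plot them side by side. File names and the "rating" column are
# hypothetical; adjust them to match your actual export.
import pandas as pd
import matplotlib.pyplot as plt

frames = {
    "PowerPoint": pd.read_csv("powerpoint_ease_of_use.csv"),
    "Prezi": pd.read_csv("prezi_ease_of_use.csv"),
}

# One review per row is assumed, with a numeric score in the "rating" column.
averages = {name: df["rating"].mean() for name, df in frames.items()}

plt.bar(list(averages.keys()), list(averages.values()))
plt.ylabel('Average "Ease of Use" rating')
plt.title("G2.com user reviews: Ease of Use")
plt.savefig("ease_of_use_comparison.png")
```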
To deal with the second issue I would have to change gears and refocus on "Effectiveness for Audience Comprehension" over "Creative Options." This struck a blow to my schedule more than anything else, as I had to go back over my sources to see whether they contained any information related to comprehension. Fortunately, my primary source study (originally chosen for its feature analysis and comparisons) also boasted a hearty section on a design philosophy called Human Centered Design (HCD), which provides standards for creating products that are as user-friendly as possible (as opposed to task-friendly). The HCD portion of the study examined how well an audience received information from PowerPoint, Prezi, and an oratory presentation. The data was broken into four HCD categories, "coherence," "engagement," "inclusiveness," and "malleability," and a post-experiment survey recorded the number of responses and the ratings each program received. I had my pivot for my secondary criterion and a strong set of data for the first, but it still felt like something was missing.
Unfortunately, I couldn't generate or find more current data, so I turned to my other sources to give the primary ones context. While opinion articles weren't valuable as primary sources, they were useful for "checking" the primary sources. Both the recent articles about Prezi and the dated ones seemed to confirm the ease-of-use issues found in my primary study. However, an older Harvard study (the first major study comparing PowerPoint and Prezi) seemed to contradict my primary source's findings on effectiveness for audience comprehension. This could be explained by many factors and required a closer look. Upon deeper investigation, the differences it found between the two programs turned out to be very small. Additionally, I discovered annotations to the study indicating a conflict-of-interest disclosure that had been left out when the study was originally published and added months later: Prezi was the study's funding source. That alone didn't discount the study, but it didn't reinforce my confidence in it either. I decided to include the Harvard study to provide additional context and balance the report's findings. More interestingly, and more importantly, the Harvard study contained a second note:
“As of the latest report Prezi still does not meet the accessibility standards of either the Americans with Disabilities Act or WCAG. Unless and until this is addressed other formats will be needed for viewers with sight and other disabilities. Doug Gray, communications specialist, Special Education Division, Minnesota Department of Education.”
This opened a new channel to examine in making a recommendation. Looking into Prezi with the previously mentioned research tools, I found no indication that Prezi met the compliance standards of the Americans with Disabilities Act (ADA), and while there was one indication that Prezi was compatible with WCAG (Web Content Accessibility Guidelines), it was with the outdated WCAG 2.0 standard, not the current WCAG 3.0. Looking into PowerPoint revealed that not only is PowerPoint ADA compliant, the software also includes a feature that checks the user's own presentation to ensure it is ADA compatible. This information was key in pushing the recommendation toward PowerPoint, since I could not conceivably recommend to an educator a program that might exclude students with impairments.
The recent studies supported PowerPoint over Prezi conclusively in the category of ease of use. Effectiveness for audience comprehension was a much closer call: PowerPoint came out on top, but the margin was razor thin and historically a toss-up.
Looking back on the research overall, I feel my process was effective in delivering a timely recommendation given a limited pool of studies to work with. The gathering and filtration methods I used saved me a ton of time while still providing a large enough pool of resources to pull from. That being said, I wish I had more time to put into fleshing out the data and delving further into the concepts behind human centered design and what the best potential presentation software might look like. If I were to do this project again, I would give myself more time to define and contextualize the data; due to time pressure I ended up using the studies' definitions for certain concepts, and while that's fine, I would have liked to examine them through a second (or third, or fourth) reputable source. Secondly, you can never have enough information for an inquiry. I like to say it's always better to have too much than not enough of a good thing, and this is especially true for any research endeavor. There is such a thing as enough data, but it is much easier to cut a dataset down to "enough" than to build one up later when you realize you need more, which nearly happened to me in this case. Lastly, I would suggest writing reports and revising them on different days if possible. Every time I think a report is complete, I give it a day and revise again, and every single time I find mistakes or better ways to formulate my work, regardless of how perfect it felt the day before.
For those interested in the report itself, you can view it by clicking here.
Thanks for reading,
-Vince
