Digitizing for Preservation vs Access

Something I’ve thought a lot about ever since my intro classes last year is the question of digitizing for preservation vs. digitizing for access – digitizing each resource with individual care to replicate it at the highest possible quality, or digitizing quickly so that lower-quality versions of resources are available to the public as soon as possible. It’s a complicated question, and the answer varies depending on the purpose of the collection: archivists would digitize the Mona Lisa for preservation, placing emphasis on the best possible replication of its colors and marks, but they might digitize Leonardo da Vinci’s written documents for researcher access, since their value comes primarily from their contents, not their artistic execution. A low-quality digital double is not an archival-quality facsimile, just as a black-and-white copier scan is a poor replication of the pages of the original book. Some grainy 8.5×11 copies serve their purpose just fine, but a book with color illustrations would lose much of its nuance and value.

The preservation vs. access debate is ongoing in digital archive circles. In their article "Digitization as a Preservation Strategy," Krystyna Matusiak and Tamara Johnston explain that cultural heritage digitization started purely as a strategy for access, a way of creating easily distributable copies of original resources, and that the movement to include digitization in resource preservation plans is relatively new and controversial. According to their assessment, however, digitization for preservation is becoming more and more popular, particularly for endangered materials like photographic negatives and audio recordings. Instinctively, I’m inclined to agree without reservations; I’m a millennial digital hoarder who likes to buy their movies in 1080p, and lower-quality digitization pains me. It could be because I grew up in the age of the internet, where storage seems endless and everything lasts forever (especially when you don’t want it to), but I err on the side of newer digitization thinking – that digitization can be part of a long-term preservation plan for analog visual archives.

However, I’m learning to spot when I need to temper that instinct. In their overview "Thirteen Ways of Looking At … Digital Preservation," Brian Lavoie and Lorcan Dempsey discuss both preservation and access. In the section presenting digital preservation as “a selection process,” they remind us that digitization can be expensive, and that it is advantageous for leaders of digitization efforts to think ahead about which objects they plan to digitize for preservation and which they plan to digitize for access. While digitizing en masse and sorting later may sound appealing, particularly when storage space is no concern, they note that “saving is not preserving” – digitizing an entire collection at the highest possible quality is rarely affordable or a good use of time, but digitizing an entire collection at a lower quality and sorting back through it for preservation would require re-digitizing the selected resources, which is not an effective use of money or time, either.

This all comes back to questions I’ve been chewing on thanks to other classes dealing with archives – namely, how do we as archivists actually know what people in the future will need high-quality digitizations of? The Mona Lisa is a pretty safe bet, but what if researchers in the future need high-resolution scans of da Vinci’s ‘boring’ handwritten records to study his penmanship? It’s impossible to anticipate every possible need, which Lavoie and Dempsey acknowledge when they refer to digital preservation as “an ongoing activity,” one that, with changing file formats and digitization technologies, is never truly complete. All archivists can do at this stage in the evolution of digitization technologies is make educated guesses based on current research patterns. Perhaps, as technologies evolve further, someone will create a miraculous workflow that allows for fast, high-quality digitization, and all of these problems will be solved – assuming, of course, that the technology is open source. Fingers crossed.

Lavoie, B. and Dempsey, L. (2004). Thirteen ways of looking at … digital preservation. D-Lib Magazine, 10(7-8). http://www.dlib.org/dlib/july04/lavoie/07lavoie.html

Matusiak, K. and Johnston, T. (2014). Digitization as a preservation strategy. The American Archivist, 77(1), 241-269. http://www.unesco.org/new/fileadmin/MULTIMEDIA/HQ/CI/CI/pdf/mow/VC_Matusiak_Johnston_28_B_1400.pdf


Levels of Preservation: A Report Card

I’ve made no secret this term that our repository, the Center for the Study of Tobacco and Society, has a very… special way of going about its collection management; by this I mean there is no coherent schema that any professional collections manager would recognize. For some quick background, our institution is almost completely geared toward producing online exhibitions, so much so that we don’t have a content management system beyond the WordPress site where we upload material for display in our online exhibitions. Unsurprisingly, this is a very poor way of providing access; in fact, none of our material is indexed and searchable, and the only way to get to an item is to find it in the exhibition in which it is featured.

Before this background bleeds into foreground, I’ll sum it up quickly: CSTS is a mess. Now, what does this have to do with the levels of preservation? Well, I was looking at the levels chart from the lecture and deciding how many of these levels we actually manage to cross, whether by purposeful effort, clever negotiation, or just tripping into them; which ones we fail miserably; and how we can rectify that situation.

So let’s create a checklist then:

  • Storage and Geographic Location
  • File Fixity (Permanence) and Data Integrity
  • Information Security
  • Metadata
  • File Formats

Each category has four levels, each with its own conditions and directives:

  • Protect Your Data
  • Know Your Data
  • Monitor Your Data
  • Repair Your Data

So let’s make a quick assessment of how our institution does. Then I will go into some detail on a few of these points, focusing on one we ace, one we fail, and one we’ve either nearly made or nearly missed.

So here’s our report card: Green (+1 point), Yellow (+0.5 points), Red (0 points). Quick note: I’m giving us credit for Level 4 of Storage and Geographic Location, because we really need the points.

13.5/20 = 67.5%. Not bad; that’s almost a D+!
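For anyone who wants to play along at home, here’s a rough sketch in Python of how that tally works. The color ratings below are illustrative placeholders, not our actual grades (which, again, came out to 13.5/20); only the Green/Yellow/Red point values come from the rubric above.

```python
# Tally a Levels of Preservation report card: one color per level (1-4)
# in each of the five categories, scored Green = 1, Yellow = 0.5, Red = 0.
POINTS = {"green": 1.0, "yellow": 0.5, "red": 0.0}

# Placeholder ratings for illustration only -- not CSTS's real grades.
ratings = {
    "Storage and Geographic Location": ["green", "green", "yellow", "green"],
    "File Fixity and Data Integrity": ["green", "yellow", "red", "red"],
    "Information Security": ["green", "green", "red", "red"],
    "Metadata": ["green", "red", "red", "red"],
    "File Formats": ["green", "green", "green", "yellow"],
}

earned = sum(POINTS[color] for levels in ratings.values() for color in levels)
possible = float(sum(len(levels) for levels in ratings.values()))
print(f"{earned}/{possible:g} = {earned / possible:.1%}")  # e.g. 11.5/20 = 57.5%
```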

File Formats, Storage and Geographic Location:

We do exceedingly well at file format management, but this is a natural outgrowth of just doing our job. We don’t store things in formats they shouldn’t be kept in, and that’s a convenience for us more than an institutional policy. Furthermore, one aspect of my job is media capture and conversion, so I have a stake in ensuring that our converted media is kept in such a way as to be accessible and usable for a good long while to come. Do you know how boring it is to do digital capture of a VHS tape of a House subcommittee meeting on the effects of cigarette smoking and the tobacco settlement? Do you know how mind-blowingly surreal it is to see a congressman in a video clip from the mid-1990s on one monitor in your office while, on your laptop, he’s being interviewed about the 2018 midterms? Damn skippy I don’t want to have to do this again, so yes, file fixity, integrity, security, the works: it’s all in play here even if we fall flat on everything else.

The geographic contingencies have more to do with how monstrously disorganized our director is. There’s a good chance anything we have here has been duplicated three or more times (hell, we’ve found a literal stack of one document copied 30 times over, all in one place, which defeats the purpose, but still!). Our “satellite locations” include a pair of storage facilities (U-Store-Its) in Houston and an undisclosed location in Pennsylvania (not top secret, I just can’t remember), our director’s house, and his office on the hill here on campus at the University Medical Center. So if one or many of these places exploded or vanished to the shadow realm, we would have “contingencies,” so I’m giving us credit for those.

File Fixity and Information Security:

We’re limited in what we can do with our file system, as we are dependent on CCHS for our storage. This applies to our web storage beyond Google Drive as well. We have redundancy and integrity contingencies using UABox (the University of Alabama’s cloud storage service), Google Drive, CCHS’s shared drive, and a physical backup SSD that sits on my desk. Write protection is enabled by default, which is good, but it doesn’t protect files outside of application use, so that’s bad. Much of this is beyond our control and reflects how our collection is effectively a hobby, a luxury that UA CCHS tolerates and promotes when it suits them. There’s talk of a new facility built from the ground up, but what we need are actual archivists to help us direct what will go into this new infrastructure, as our director is again more interested in creating a museum space and space for his exhibitions: actual physical manifestations of the collection that SHOULD be outgrowths of a properly managed collection.
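We don’t have a formal fixity workflow, but even with storage out of our hands, a checksum manifest for the backup SSD would at least tell us whether the copies on UABox, Google Drive, and the shared drive still match what we captured. Here’s a minimal sketch of what that could look like in Python; the drive path and manifest name are placeholders I made up, not anything we actually run.

```python
# Minimal sketch: walk the backup SSD and record a SHA-256 checksum for every
# file, so the copies in cloud storage can later be verified against it.
import csv
import hashlib
from pathlib import Path

BACKUP_ROOT = Path("D:/CSTS_backup")   # hypothetical mount point of the SSD
MANIFEST = Path("csts_manifest.csv")   # hypothetical output file

def sha256(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in chunks so large video captures aren't read into memory at once."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

with MANIFEST.open("w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out)
    writer.writerow(["relative_path", "sha256"])
    for item in sorted(BACKUP_ROOT.rglob("*")):
        if item.is_file():
            writer.writerow([item.relative_to(BACKUP_ROOT).as_posix(), sha256(item)])
```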

Metadata

As I’ve mentioned in the past, CSTS doesn’t really have a proper CMS. We’ve had a mix of services and platforms, but nothing to really help us create a consistent metadata schema and manage it properly by doing the actual data entry and processing needed to build a system that can pass muster beyond Level 1. We have data fixity and security and a method of identifying what goes onto our servers and onto our website, but that’s where the good news ends. Without actual metadata to speak of, we can’t begin to store or transmit it as Level 2 requires, nor can we store technical and descriptive metadata outside of placeholder file names that have the following format:

[Date-YYYY-MM-DD] – [Publication/Author] – [Title/Description].{file}

It’s sufficient for a placeholder and can be searched using Windows File Explorer, but it’s not entirely secure, as it can easily be altered just by renaming a file. So it’s a failure on all fronts save one, and even that one is a heavily caveated check mark.
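As a thought experiment, even those placeholder filenames could be mined for rudimentary descriptive metadata. Below is a quick sketch of what that parsing might look like, assuming a file actually follows the pattern above; the regex and the sample filename are my own inventions, not part of our workflow.

```python
# Sketch: split "[date] - [publication/author] - [title].ext" filenames into
# fields that could seed a real metadata record later.
import re

PATTERN = re.compile(
    r"^(?P<date>\d{4}-\d{2}-\d{2})"      # YYYY-MM-DD
    r"\s*[–-]\s*(?P<source>.+?)"         # publication or author
    r"\s*[–-]\s*(?P<title>.+)"           # title or description
    r"\.(?P<extension>[^.]+)$"           # file extension
)

def parse_filename(name):
    match = PATTERN.match(name)
    return match.groupdict() if match else None

# Invented example filename for illustration.
print(parse_filename("1994-03-25 – New York Times – Tobacco Settlement Hearing.pdf"))
# {'date': '1994-03-25', 'source': 'New York Times',
#  'title': 'Tobacco Settlement Hearing', 'extension': 'pdf'}
```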

Conclusion

This report card underscores the fact that the Center for the Study of Tobacco and Society is a collection, not an archive. Tangible steps have to be taken, and will be taken in the next year, to rectify these glaring issues and to reinforce the successful policies that have already been implemented.

Digital Preservation and Collaboration at the State Level

I have always found digital preservation to be a fascinating subject. I think it is because of my background in cataloging, historic preservation, and IT troubleshooting. It may also be that I realize what a labor-intensive activity creating digital files and accompanying metadata is, and what an incredible loss the corruption or accidental deletion of that data would be.

It is my personal observation that data preservation is often an afterthought for digital librarians. This is very hard to understand. The cost involved in creating metadata and digital files is very high; why are we so passive about protecting them? Perhaps we are just too accustomed to paper books, which, if stored in relatively stable climate conditions, will sit unharmed on a shelf for hundreds of years and still retain their usefulness. I recently participated in an email exchange with Wendy Robertson, Institutional Repository & Metadata Librarian for The University of Iowa Libraries. I am currently serving on a state consortium task force that is looking to offer consortium-level access to digital library software for our approximately 50 libraries. We are considering the repository platform Digital Commons by Bepress and are concerned about what type of data preservation options would be available to our client libraries. Ms. Robertson was contacted as a user of Digital Commons. According to Ms. Robertson, Digital Commons requires an extra fee for preservation. Her files are all stored both in Digital Commons and on an Amazon server. Keeping a backup set of data is of course an important start, but she remains concerned about the lack of “any kind of proper preservation with bit sums etc.” She hopes Amazon is helping to keep an eye on the data’s stability, but is unsure if this is part of the services they are receiving from them (W. Robertson, personal communication, November 9, 2018).
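For what it’s worth, the basic “bit sums” check Robertson describes wanting isn’t exotic: if the repository exports (or you generate) a manifest of checksums, a periodic script can re-hash the local or backup copies and flag anything missing or altered. The sketch below assumes a simple path/SHA-256 CSV manifest of my own devising; it isn’t a feature of Digital Commons or of Amazon’s storage service.

```python
# Sketch: compare files on disk against a previously saved checksum manifest
# and report anything that has gone missing or changed since the last check.
import csv
import hashlib
from pathlib import Path

def sha256(path):
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify(manifest_path, root):
    """Assumes a CSV manifest with 'relative_path' and 'sha256' columns."""
    with Path(manifest_path).open(newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            target = Path(root) / row["relative_path"]
            if not target.exists():
                print(f"MISSING  {row['relative_path']}")
            elif sha256(target) != row["sha256"]:
                print(f"CHANGED  {row['relative_path']}")

# Hypothetical manifest and backup directory names.
verify("repository_manifest.csv", "backup_copies")
```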

All this really got me thinking about what could help libraries like Robertson’s and our consortium members get a handle on the thorny issue of data preservation, and I began to wonder if the answer could be collaboration.

In late 2014, librarians from Montana State University, the University of Montana, Montana Tech, and the Montana Historical Society identified a common need to develop digital preservation workflows at their respective institutions (Mannheimer and Cote, 2017, p. 4). After surveying the needs of each library, they determined the requirements that would fulfill their institutional needs for data preservation; these included bit-level integrity checks, geographic distribution of storage servers, the ability to access content and associated metadata (not a dark archive), system usability, and the ability to accommodate multiple administrators and multiple logins (p. 7). Based on these criteria, they found four commercial preservation services that they thought would be worth investigating: Preservica, Rosetta, DuraCloud, and MetaArchive (p. 8). The group’s experience in collaboration not only led to cost savings, but also serves as a model for institutions searching for help to resolve their preservation challenges, in that members benefited from “shared knowledge and expertise gained during the partnership” (p. 1).

In 2003-2004 the Indiana State Library hosted a digital library summit. At this summit, five goals were established by representatives from libraries around the state: developing a statewide portal for access to participating institutions’ digital projects; providing educational opportunities to demonstrate best practices in digitization; offering an avenue for all cultural heritage institutions within state boundaries to participate; promoting the use of these collections by educational institutions; and providing a digital preservation solution for existing and future digital collections (May, 2017, p. 223). The Indiana State Library created the Indiana Memory Project site, which served as the statewide portal for access to digital collections, but individual participating libraries remained responsible for storage and maintenance of their own digital files. Several years later the State Library awarded a Library Services and Technology Act grant for the development of “a statewide, community based, collaborative approach to digital preservation” (p. 225). Member organizations split the cost of a collaborative membership in the MetaArchive Cooperative Preservation Network. The Indiana State Library functioned as the fiscal agent, and the Cunningham Memorial Library at Indiana State University served as the technical supervisor of the LOCKSS server.

Digital preservation at the collaborative level may be a fairly new concept, but there are examples that can be followed. Commercial preservation organizations have been willing to work with collaborative groups made up of small and medium-sized institutions. Perhaps most encouraging, these commercial organizations have been willing to make such relationships cost-effective for such libraries. I think that if libraries and cultural institutions are going to continue to ensure their relevance for the long-term future, it is time they start guaranteeing their resources will survive long term. It would be nice if a federal institution would contribute to meeting these challenges, but I don’t see this becoming a priority in the current political environment. The next best thing may be action at the state consortium level.

Davina Harrison

References

Mannheimer, S., & Cote, C. (2017). Cultivate, assess, advocate, implement, and sustain: A five-point plan for successful digital preservation collaborations. Digital Library Perspectives, 33(2), 1-20. http://doi.org/10.1108/DLP-07-2016-0023

May, C. A. (2017). InDiPres: A statewide collaborative approach to digital preservation. Digital Library Perspectives, 33(3), 221-230. http://doi.org/10.1108/DLP-08-2016-0035

The past as the key to the future

This post shares an article about digital preservation: “Bridging the Two Fossil Records: Paleontology’s ‘big data’ future resides in museum collections.”

The authors begin by introducing the concept of two distinct but intertwined fossil records. The first is the physical record, consisting of material objects, the fossils themselves, whether residing in collections or remaining in nature awaiting future discovery (Allmon et al., 2018). The second is the abstracted fossil record, consisting of contextual and comparative information gathered by researchers, including any interpretations (Allmon et al., 2018). The entirety of the abstracted record is based on the physical record; the authors state the obvious, that this is simply how paleontology works (Allmon et al., 2018). Over time the physical record requires reinterpretation, adding to the abstracted record; this cycle reveals that the physical record is the true source of data. While both records can be examined, Allmon et al. (2018) make very plain the benefits of studying or re-studying the physical fossil record. As primary data sources, fossils preserve information that may not be currently accessible, meaning new discoveries can be made from specimens collected hundreds of years ago thanks to new technologies and knowledge. They also form the basis of biology and paleontology, as verification and replication of observations is essential to further advancing the fields and is a fundamental concept of science itself.

A look at the history of paleontology shows that in the 1970s and 1980s there was a shift in focus from studying specimens to data digitization and big data studies stemming from the published literature. This was innovative for the time and allowed many questions to be answered that had previously been impossible even to ask. Allmon et al. state that digitized big data is still the future, but the literature is a finite resource, and much has developed scientifically and technologically since then. Current paleontology databases are good tools, but they need improvement for a variety of reasons. First, only a very small percentage of the data is digitally available compared to the potential scope, as the vast majority of collections have not been digitized, making the databases far from comprehensive (Allmon et al., 2018). Gaps in the digitized record are evident not only in which specimens and geographic locations are included; the metadata is also not standardized, so some specimens have metadata that others lack. This leads not only to interoperability problems but also to quality issues. Another quality concern is the reliability of taxonomic identifications; outdated, incorrect, or missing identifications (a large proportion of museum specimens have not been identified at all) are a problem in a research database, since incorrect information leads to incorrect results (Allmon et al., 2018). Increased engagement of researchers could alleviate some of these issues; however, there is little if any reward system or incentive for them to do so, as there is for writing publications and receiving tenure. The last major obstacle to digitizing specimen data is the availability of funding to ‘digitize everything.’ The authors include some crude calculations of the cost of digitizing only the identified specimens in American museums, at $1 USD per specimen, to be about $75 million USD, and compare it to the NSF’s $10 million USD allocated in the 2017 budget for the Advancing Digitization of Biological Collections program for all natural history collections (Allmon et al., 2018). Seeing these rough totals, they conclude that digitizing all or even most is very unlikely. Questions then remain, such as what to prioritize and how much digitization is sufficient.
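To make their back-of-the-envelope comparison concrete, here is the same arithmetic written out; the only figures used are the ones they cite, and the “years of budget” line is simply the ratio, not a claim about how the ADBC program actually allocates its funds.

```python
# Restating Allmon et al.'s crude cost comparison.
identified_specimens = 75_000_000   # approx. identified specimens in US museums (their figure)
cost_per_specimen = 1               # USD per specimen, their assumed rate
adbc_budget_2017 = 10_000_000       # USD, NSF ADBC allocation cited for 2017

total_cost = identified_specimens * cost_per_specimen
print(f"Estimated cost: ${total_cost:,}")                          # $75,000,000
print(f"Equivalent to {total_cost / adbc_budget_2017:.1f} years "  # 7.5 years
      f"of the entire 2017 ADBC budget")
```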

I found this article very interesting as it brings up points we’ve discussed during the digital preservation class and in other weeks’ topics. Their concern is that while digitization of data has already been able to answer some big data research questions, it’s come to a point where the databases need maintenance and new data to continue being useful; we talked in class about digital preservation being an ongoing activity, and that is clear here. They bring up issues of bias in which data is excluded; in this case, identified specimens are more likely to be included than unidentified ones. We’ve talked about bias in LAMs before, concerning which objects are included and how they are described. Being more inclusive would lead to a more robust digital collection, enhanced accuracy of data, and better research. This would happen in an ideal world; however, it is quite sad that the funding simply isn’t there. I recall a paper from earlier in the course by Erway and Schaffner, who advocated for mass digitization as soon as possible to expose hidden collections, with quality enhanced over time as resources allow (Erway and Schaffner, 2007). While this can be a positive approach for many projects, Allmon et al. are of the opinion that in this case, for fossil collections (even just the identified specimens) residing in the U.S., not even this is feasible.

References

Allmon, W. D., Dietl, G. P., Hendricks, J. R., & Ross, R. M. (2018). Bridging the two fossil records: Paleontology’s “big data” future resides in museum collections. In G. D. Rosenberg & R. M. Clary (Eds.), Museums at the Forefront of the History and Philosophy of Geology: History Made, History in the Making (Geological Society of America Special Paper 535).

Erway, R., & Schaffner, J. (2007). Shifting gears: Gearing up to get into the flow. OCLC Programs and Research.

Digitization Conundrum

In the modern age, people still seem to see archives as guarded treasure troves. While this is not untrue, they are far more accessible than many think. On the other hand, people also seem to expect everything to already be digitized and waiting for them to ask for it. This brings up an interesting point: how do archivists know what to digitize first? What will be asked for in the future by the majority of patrons? At the McCall Library we have always been primarily known for our large photograph collection. Despite the library having been around for forty years, only about 5% of our negatives have been digitized, and an even smaller share of our manuscript collections. This is primarily due to a lack of staff, funding, and equipment.

In 2011 the McCall Library received an unrivaled collection of papers from some of the most prominent families in Alabama history. The collection was valued at $3.1 million. One of the stipulations for our receiving the collection was that we would digitize the plantation ledgers. We do not know why these were chosen, as there are far more valuable research materials in this collection, but digitize them we must. Earlier this year I pointed this clause out to our new director of the libraries, and I was tasked with finding a scanner that would work for both this project and ILL.

Our new director then stated we should make a priority list for items to be digitized after these ledgers. This made us start wondering what would be most useful. Having worked there for four years, I have noticed that the requested collections change roughly every five months and come in large waves. This makes it difficult to determine what would be highest priority. In the past we digitized items as they were requested instead of making these decisions on our own. Jones (2017) gives an interesting insight into why and how books were chosen for digitization in the past and how personal bias unknowingly plays a role. When it comes to our negative collections, we have always decided based on requests and the condition of the negatives. As they get closer to a century old, we have started trying to speed up the digitization process, but the side effects of working with the chemicals and the lack of scanners and workers make this difficult.

At one point our director worked with Historypin to upload images of historic buildings where they once stood. This allowed her to make some interesting decisions about what to use in a book she coauthored. However, a lack of interest made her abandon the project halfway through. Cohen (2010) mentions how “playing” with history like this can lead to some interesting discoveries. I find this to be very true. We were once asked where the Spanish Alley in Mobile, AL was located, but the exact location was not written anywhere. I scoured through our pictures and Google Maps to compare the areas until I located the buildings on all sides of it and could conclusively place it. I was even able to figure out why it was nicknamed the Spanish Alley: it had once been next door to the Spanish consulate in Mobile. Without “playing” around with these tools I could not have found the answer.

Another roadblock in deciding what to digitize is space, both local storage and space on the hosting site. Currently we are restricted in the space we have to host our collections, so we have only our finding aids and roughly a hundred photographs displayed there. This frustrates our patrons to no end because they cannot simply access the collection they want. Instead they have to contact us, wait for us to digitize it, and in some instances pay for the cost. Wu, Thompson, Vacek, Watkins, and Weidner (2016) give a fairly detailed look at how to evaluate and choose a platform for hosting digital collections and why. Our institution is the main roadblock to providing the information our patrons request from us. Hopefully, as digitization becomes the norm, we will be able to convince the president that this is a good use of the university’s funds, space, and time.

 

References

Cohen, P. (2010). Digital keys for unlocking the humanities’ riches. The New York Times. http://www.nytimes.com/2010/11/17/arts/17digital.html

Jones, E. (2017). The public library movement, the digital library movement, and the large-scale digitization initiative: assumptions, intentions, and the role of the public. Information & Culture, (2), 229. Retrieved from http://libdata.lib.ua.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=edsglr&AN=edsgcl.494741834&site=eds-live&scope=site

Wu, A., Thompson, S., Vacek, R., Watkins, S., & Weidner, A. (2016). Hitting the road towards a greater digital destination: evaluating and testing DAMS at University of Houston libraries. Information Technology & Libraries, 35(2), 5–18. Retrieved from http://libdata.lib.ua.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=tfh&AN=116674974&site=eds-live&scope=site

Digital Preservation

I recently talked with a coworker about digital preservation and preservation policies. The department she works in is getting ready to acquire some new preservation software; more specifically, they are going to use Rosetta by Ex Libris. Rosetta covers digital asset management and preservation for several types of institutions, including museums and libraries. It seems like a great piece of software to implement in a library. Since my department wasn’t part of this process, we won’t be involved in using it. Why isn’t the digitization lab I work in part of this new endeavor? This wasn’t much of a surprise; the library I work at has a bit of a communication issue. I ranted about this in my last post if you need some backstory, so no need to repeat it here. Digital preservation combines policies, strategies, and actions to ensure access to content that is born digital or converted to digital form, regardless of the challenges of file corruption, media failure, and technological change (American Library Association, 2010). I wanted to understand more about digital preservation policies and how those policies are put into action.

It seems that having a preservation policy in place before starting down the long road of actual preservation would make everything run more smoothly. There are many examples to look at when researching different policies. A few worth mentioning among many are the Dartmouth College Library, Yale University Library, and North Carolina digital preservation policies. These policies follow a similar structure: they address access, sustainability, rights information, best practices, and collaboration, and that is far from an exhaustive list. The policy provides the underpinnings for the preservation process. The collaboration part caught my attention, mainly because it is lacking in my place of employment. A successful example can be seen in the Smithsonian’s Time-Based Media and Digital Art working group, which came up with a plan and combined resources with several other branches in order to facilitate the best conservation practices for these digital materials. The list of contributors working on the same project is impressive. Research by Cote and Mannheimer (2017) shows that the co-implementation of a digital preservation service allows for cost sharing, collective training, collective troubleshooting, and the development of administrative expertise (p. 110). Their research stems from four Montana librarians taking the initiative to develop a digital preservation strategy that would benefit libraries statewide. Collaboration is at the core of this process, helping libraries that otherwise have little guidance and few resources.

There was a slide in class during week eleven that focused on the levels of the preservation process. One level in particular that I took note of was the bullet point about agreed-upon actions across cultural institutions. This point is another example of forming partnerships to achieve a goal. Finding ways to work with other departments and institutions would benefit everyone; it seems simple enough. Lavoie and Dempsey (2004) state that digital content must be easily shared between services or users, usable without specialist tools, surfaced in a variety of environments, and supported by consistent methods for discovery and interaction.

A final example of partnership in digital preservation is a newspaper digitization project in North Texas. Ana Krahmer (2016) suggests starting small; this way, both groups can get a sense of what each will contribute to the overall project, and each group can see how its contributions represent a portion of the sum of all parts (p. 85). The examples of collaboration I gave above involve several large institutions on one project, so I believe it is beneficial to see this idea of partnership on a smaller scale. It may give rural communities the confidence to start a preservation initiative.

I like the idea of laying out a strategy so everyone is completely aware of how a project is going to develop. This seems easy enough, right? I suppose the way my library works leads me to be so intrigued by a fairly straightforward idea such as this. The same applies to metadata. Since both of these elements are missing from our workflow, it’s no wonder I’m fascinated by them. In the future, it would be interesting to propose a digital preservation policy for my department and see how people react to it. Actually, the entire library would benefit from it. A future project, maybe?

 

American Library Association. (2010, January 18). Definitions of digital preservation. Retrieved from http://www.ala.org/alcts/resources/preserv/2009def

Krahmer, A. (2016). Digital newspaper preservation through collaboration. Digital Library Perspectives, 32(2), 73–87. https://doi-org.libdata.lib.ua.edu/10.1108/DLP-09-2015-0015

Lavoie, B., & Dempsey, L. (2004). Thirteen ways of looking at … digital preservation. D-Lib Magazine, 10(7/8). https://doi-org.libdata.lib.ua.edu/10.1045/july2004-lavoie

Mannheimer, S., & Cote, C. (2017). Cultivate, assess, advocate, implement, and sustain. Digital Library Perspectives, 33(2), 100–116. https://doi-org.libdata.lib.ua.edu/10.1108/DLP-07-2016-0023

The Responsibility of Project Leaders

One often overlooked aspect of creating a sustainable project is the selection of an appropriate leader. Generally, this responsibility is automatically assigned to those with higher authority or to senior staff. It may be wiser instead to choose a leader by looking at the attributes of each individual and determining who is the best fit for a particular project. But what leadership skills do project managers need? How does one keep order in a project if disharmony occurs?

For funders, leadership is “a critical part of the evaluation process as they look for a balance between leadership, planning, and institutional investment that is important even if grantees cannot answer every question about sustainability.”1 Leadership ability is not the only knowledge area required to lead a project. The ability to conserve finances, management skills, knowledge of long-term sustainability processes, and an aptitude for planning are all important for creating a project or program that is intended to last. Should the chosen library team be lacking in these traits, or in any specific aspects of the project, the leader needs a range of other skills, including marketing for funds, outreach on behalf of the project, and general editorial and technology skills.1 If no single employee seems to fit the attributes that would make a suitable project leader, don’t simply put the ‘best’ person on the job and move on. It may be necessary to develop the leadership and management skills of your staff through training.

“Effective leaders are those who apply the appropriate skills at the appropriate time for the appropriate situation.”2 Because a project leader maintains good progress through the work of a team, it is necessary that the team have consistently good communication, common goals, and equal shares of responsibility set forth by the team’s leader. However, breakdowns in these can easily occur if the leader is neither direct with imperatives nor strong enough to hold control over the team. My own workplace is currently besieged by arguments and petty disputes due to weak leadership, and I know firsthand the havoc wreaked by colleagues butting heads in a work environment. The greatest causes of such breakdowns are the absence of trust among team members, fear of conflict, lack of commitment, avoidance of accountability, and inattention to results.2 Each of these can easily lead to another, and the team’s project might be at risk from such dysfunctional relationships. “Understanding each of the team dysfunctions and exploring ways to overcome them (i.e., focusing on achieving the opposite of each dysfunction) is a great test to one’s leadership skills. An effective leader assesses the team’s weaknesses, what team dysfunctions exist within the team, the causes of the dysfunctions, and apply ways to overcome the dysfunctions to improve team performance.”2

 

Kumar, V. S. (2009). Essential leadership skills for project managers. Paper presented at PMI® Global Congress 2009—North America, Orlando, FL. Newtown Square, PA: Project Management Institute. Retrieved from https://www.pmi.org/learning/library/essential-leadership-skills-project-managers-6699

Maron, N. L., & Loy, M. (2011, June). Funding for sustainability: How funders’ practices influence the future of digital resources. JISC. https://sr.ithaka.org/wp-content/uploads/2015/08/Fundin