2015
DOI: 10.18438/b8pw32

Urban Public Libraries Do Not Yet Meet Benchmarks for Web Accessibility by Individuals with Disabilities

Abstract: A Review of: Maatta Smith, S. L. (2014). Web Accessibility Assessment of Urban Public Library Websites. Public Library Quarterly, 33(3), 187-204. http://dx.doi.org/10.1080/01616846.2014.937207

Objective – To determine the extent to which urban public libraries in the United States of America provide web sites which are readily accessible to individuals with disabilities, with reference to the Urban Library Council’s EDGE initiative (specifically Benchmark 11, “Technolog…

Cited by 1 publication (1 citation statement)
References 4 publications
“…The fact that the numbers do not reflect true accessibility has been confirmed by Yoon et al. (2016a, 2016b), who conducted a side-by-side comparison of an automated checking tool with six visually impaired testers and found little correlation between what the tool and the users deemed as actual problems. Mixed-method studies are becoming more common, where researchers supplement automated accessibility testing with manual evaluation against checklists to catch additional usability issues (Conway, 2011; Conway et al., 2012; Oud, 2012; Billingham, 2014; Maatta Smith, 2014; Glusker, 2015). Several articles report additional testing with assistive technology, but this was carried out almost exclusively by users who do not identify as disabled.…”
Section: Evaluating Accessibility
confidence: 99%