ABSTRACT

A need exists for accurate measurement of the rural digital divide in order to direct public policy assistance most effectively. This study asked: Can the rural-urban digital divide be accurately measured? Two pilot studies were performed to develop, test, and evaluate an inexpensive technology alongside social science metrics. The methodology measures from the consumer's perspective, matching broadband quality and availability measures with quality of life metrics. The successes and iterative improvement of the scientific approach are detailed. Recommendations for future projects that measure the rural-urban digital divide are provided for policymakers and researchers to consider.

Rural areas continue to face digital inequality compared to urban areas.1 Urban areas have access to a myriad of next-generation advanced information communications technology (ICT), whereas rural areas experience disparities in service type, price, and reliability.2 Initially, the urban-rural digital divide was one dictated by quantity of subscribers, or demand-driven digital inclusion.3 Although acute, this divide has been considered acceptable because the underlying medium was, and still is, not classified as a utility. However, the maturation of the Internet has shifted this medium into the driving force of the future; it is the medium that drives current and future education mechanisms, precision agriculture, ubiquitous healthcare, and cultural connectivity. As the Internet becomes the driving force of our current and future societal needs, the issues of quality and connectivity that exacerbate the urban-rural digital divide have become impossible to ignore.4 More recent federal and state government funding initiatives have pushed investments toward rural broadband ICT infrastructure, but large geographic gaps in connectivity still exist.

Public policy and regulation have a direct impact upon the availability (or lack thereof) of ICT services in rural areas. Hollifield, Donnermeyer, Wolford, and Agunga found that when public policy, in the form of universal service funding in rural high-cost areas, failed to support the early implementation of ICT services, communities began investing in self-development projects with only limited success. These efforts have made only a small step toward better rural infrastructure and do not begin to address the circumstances of rural households and agricultural operations that reside outside of community boundaries.

With the regulatory focus on increasing competition among ICT service providers since the passage of the Telecommunications Act of 1996 (TA96), consumers in urban areas have benefited greatly, which has driven an even larger digital divide between urban and rural regions.5 This divide is most vividly perceptible when underserved (or even unserved) rural consumers of ICT pay rural penalties, such as higher prices, lower bandwidth, lack of reliability, few service provider choices, or no broadband service options at all.6,7 Although Whitacre and Mills argue that the real problem in rural areas is the lack of computing education and training programs, Internet availability and quality must be addressed first before initiatives like community vitality and training programs can hope to be effective.8

To begin to address the problem of quality and availability of Internet connectivity in rural areas, accurate connectivity measures need to be established. Thus, the driving research question for this Rural Measures study is, can the rural-urban digital divide be accurately measured?

Accurately measuring the urban-rural digital divide should not only focus on Internet availability and connectivity gaps but should also be grounded in independent, scientific methodology. This Rural Measures study attempts to lay the groundwork for such a methodology while providing an in-depth look at a technological solution for measuring the urban-rural digital divide. To lay that groundwork, a literature review examines policy and regulation in this area, technologies currently available in rural areas, and the measures currently available for measuring the urban-rural digital divide. The methodology focuses on a scientific approach that can produce an accurate picture of the rural-urban digital divide. The article ends with a discussion of the post-pilot adjustments made to the scientific methodology and the resulting recommendations that continue to be distributed.

Literature Review

To begin to establish better-quality Internet connectivity with wider availability of broadband speeds in rural areas, accurate measurements of such quality and availability must be established. Measurement will provide an accurate geographic picture to drive policy (and ultimately funding mechanisms) toward the appropriate gaps and toward a truly universal system. Since policy and regulation have historically been one way of establishing universal service, past policy is examined in the following sections alongside current technologies in rural areas. Current measurement philosophies are also surveyed, alongside the technical distinction between bandwidth and consumer-available throughput. These ideas are integral to understanding the Internet and its role in the urban-rural digital divide.

Policy and Regulation

Policymakers have attempted to mitigate the urban-rural divide for decades. The earliest attempts to provide rural residents with services similar to those of urban residents can be found in the technological revolutions of electrical power and the landline telephone. For example, in 1930, only 10% of American farms had electricity, compared to 100% of urban areas. Three decades later, nearly 100% implementation of electrical power in rural areas had been achieved because of regulatory policy in the form of the federal Rural Electrification Administration.9

In 1907, AT&T president Theodore Vail announced a new slogan for the company: “one system, one policy, universal service.”10 The concept of service for all, or universal service for the telephone, took hold with the Kingsbury Commitment of 1913, in which AT&T Vice President Nathan Kingsbury agreed to allow competitors to interconnect with the Bell telephone system. Universal service became a central theme in the Communications Act of 1934, codified at 47 U.S.C. §151 with the statute “… to make available, so far as possible, to all the people of the United States … a rapid, efficient, Nation-wide, and world-wide wire and radio communication service with adequate facilities at reasonable charges …”11 This Act prompted regulatory decisions to support low-income users and high-cost areas through subsidies from long-distance telephone service to local exchange carriers. This regulatory policy helped support the growth of the telephone system into a nearly ubiquitous network, creating a solid base for the eventual information technology revolution.

Further efforts toward universal service through equitable cost came with the lifeline program for landline telephones, established in 1984 by the Federal Communications Commission (FCC).12 However, success of the universal service program has typically been measured by quantity of subscribers, which may not be the best measurement.13

Public Law 104-104, known more commonly as TA96, was described by Congress as an act to enhance competition, lower prices, encourage higher-quality services, and facilitate rapid deployment of new technologies. TA96 codified the universal service theme in 47 U.S.C. §254(b)(3), stating that all consumers, regardless of economic status or whether they live in rural, urban, insular, or high-cost areas, shall have access to reasonably comparable information services at comparable rates. When TA96 was signed into law on February 8, 1996, the World Wide Web had been in the public domain for less than three years.14

Moving the concept of universal service into the age of the Internet, the FCC has attempted to promote digital inclusion and resolve the inequality of bandwidth speeds between urban and rural residents through an updated version of the universal service program called the Connect America Fund.15 Modernization of the lifeline program occurred in 2016, when the FCC revised the program to include broadband services for low-income households, stating in its rationale, “Accessing the internet has become a prerequisite to full and meaningful participation in society.”16 As the societal importance of Internet access becomes more evident, Salemink, Strijker, and Bosworth state, “Developments so far indicate that telecommunication companies will not provide every rural household or business with a high-speed internet connection comparable to those in urban areas. Rural areas are served last, if they are served at all.”17 Salemink et al. conclude that economic differences between “well-connected” and “poorly connected” areas will continue to grow, while other researchers point out that broadband is essential to support existing industries and attract new ones.18

As with the historical implementation of the power grid and landline telephony in rural areas, regulatory policy may continue to be needed to help mitigate the urban-rural inequality of digital connectivity. However, directing policy funds appropriately, so as to improve the Internet into a truly ubiquitous system, remains an outstanding issue.

Technologies in Rural Areas

Due to the universal service policy of the FCC, the telephone and electrical networks are pervasive in society. As of the year 2000, the FCC reported that 94.1% of households had telephone service (predominantly landline).19 However, starting in November 2001, the FCC began asking what type of telephone service households used, with 1.2% of households indicating wireless only. Today, two out of five households have wireless-only service, forcing the FCC to continue to refine this accessibility question.20

With landline telephone service, consumers have a direct (typically copper wire) subscriber connection back to the telephone central office. Because of the nearly universal nature of telephone service and the already installed copper line infrastructure, dial-up was initially the primary Internet access method. These lines were later adapted for simultaneous voice and digital signal transmission, called DSL.21 However, DSL had an initial length limitation of 2.2 miles22 due to the signal attenuation of copper. This left most remote rural residents with the much slower, already outdated, dial-up access. Moving higher-bandwidth DSL broadband access closer to rural residents involved expensive backhaul upgrades. More recently, multiservice operators have capitalized on their cable TV infrastructures to deliver Internet. However, this infrastructure is not viable for households outside of the community boundary, and especially not for households in extremely rural, low-density population areas. As such, fiber optics became the ideal candidate for the long backhaul distances associated with rural customers.23

Advances beyond fiber optics have been made in closing the last-mile connection in rural areas. Wireless technologies have emerged for this last mile (i.e., fixed or terrestrial wireless, mobile wireless, or satellite based).24 In Canada and, more recently, the United States, fixed wireless providers were found to primarily target underserved areas and successfully provide customers with broadband Internet speeds.25 Broadband satellite can also reach remote areas but has not proven to be an effective competitor against fixed wireless technologies. Reliability of fixed wireless also remains an outstanding issue, and the availability of rural area providers is limited. The cost can also be prohibitive: Schneir, Rendon, and Xiong found that implementing broadband service in rural areas away from a community center costs an average of 80% more than in small rural towns and villages.

Measurement of Available Consumer Broadband

When Internet service is examined, technical terms such as throughput and bandwidth are often used as synonyms. Bandwidth is the maximum theoretical capacity of any Internet medium to transfer data in a period of time and is defined in megabits per second (Mbps) or gigabits per second (Gbps). Throughput is the actual capacity (often called speed) that the consumer obtains from source to destination over their Internet medium in a given period of time; it is also measured in Mbps or Gbps. Latency is a measure of time delay, reported in milliseconds (ms). All of these technical terms bear directly on the availability and quality of Internet service. The FCC defines broadband as 25 Mbps download speed and 3 Mbps upload speed. Thus, the availability of broadband is currently defined by whether or not a geographic area has access to such speeds, and the quality of the connection reflects whether the consumer is attaining such throughputs, in real time, with low latency (latency is not currently defined as a measurement in policy but is noted as an issue). The higher the speed of a consumer connection and the lower the latency, the better the quality. The term “available consumer broadband” is used in conjunction with these technical terms to refer to the overall availability and quality of Internet service throughout the United States.
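To make these definitions concrete, the following minimal sketch (in Python, the language later used for the QT software) classifies a single consumer reading against the FCC's 25/3 Mbps broadband definition. The 100 ms latency cutoff is purely an illustrative assumption, since, as noted above, policy currently defines no latency threshold.

```python
# Classify one consumer speed reading against the FCC 25/3 Mbps broadband
# definition. The latency cutoff is an illustrative assumption only; policy
# currently defines no latency measurement.
FCC_DOWNLOAD_MBPS = 25.0
FCC_UPLOAD_MBPS = 3.0
ILLUSTRATIVE_LATENCY_MS = 100.0  # assumption, not a policy value

def classify_reading(download_mbps, upload_mbps, latency_ms):
    """Return a rough availability/quality label for one speed test."""
    meets_speed = (download_mbps >= FCC_DOWNLOAD_MBPS and
                   upload_mbps >= FCC_UPLOAD_MBPS)
    if not meets_speed:
        return "below broadband threshold"
    if latency_ms > ILLUSTRATIVE_LATENCY_MS:
        return "broadband speed, degraded quality (high latency)"
    return "broadband speed, acceptable quality"

print(classify_reading(27.1, 3.4, 42.0))  # broadband speed, acceptable quality
print(classify_reading(10.5, 1.2, 42.0))  # below broadband threshold
```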

By examining different Internet cost packages and available speed options, Obermier, as well as Obermier and Hollman, found large disparities between urban and rural areas in cost, availability of providers, and speed options. Schneir et al. found that DSL was the least expensive option for providing up to 30 Mbps in towns and villages, but DSL could not provide a consistent 30 Mbps in rural areas. Fiber to the home was the most expensive option for providing 100 Mbps service, at a very high expense to rural customers, but could provide consistent speeds to any area.

Additionally, concerns exist with the measurement and reporting of the diffusion of broadband infrastructure.26 All facilities-based service providers offering Internet connection speeds exceeding 200 Kbps must report bandwidth speeds to the FCC through an official reporting process. Reporting of available bandwidth speeds initially occurred by zip code, then by census tract, followed by a further adjustment to the level of the census block, making longitudinal comparisons over multiple years difficult. Since 2014, broadband adoption rates have been reported on a scale from 0 to 5 (with 5 meaning over 800 broadband connections per 1,000 households).27 This is a problematic measurement methodology for low-population-density areas, not to mention that the data is self-reported by service providers; no independent auditing or cross-checking has been established. Grubesic, as well as Lennett and Meinrath, similarly reported that the first iteration of the national broadband map largely overestimated broadband availability in the United States. This is due, in large part, to the block reporting mechanism, in which a provider can identify an entire geographic block as having coverage when only part of the block may actually be covered. The block reporting mechanism is simply not granular enough to capture coverage discrepancies.

Rural digital infrastructure disparities are documented in many countries, such as Australia, the United Kingdom, Brazil, Chile, and Canada.28 As such, measuring consumer-available broadband has gone through many different processes in the last few years. In the United Kingdom, 1,506 testing boxes were distributed in 2010 that automatically collected throughput data, but only during off-peak hours.29 This mechanism only established potential availability without testing for network bottlenecks or peak throughput issues. The FCC has been measuring consumer-available throughput since the implementation of the Measuring Broadband America project in 2011 through a partnership with the SamKnows company.30 Participants in the ongoing study install a router (whitebox) that performs speed tests to collect download speed, upload speed, and latency data, which is then compared to the participant's subscription service. In the Eighth Measuring Broadband America report, the FCC found that most of the participating providers delivered measured download speeds that were 100% or better of advertised speeds during peak hours.31 However, service providers were notified and knew exactly where whiteboxes were placed, the sample size was very low, and only a small portion of large service providers placed whiteboxes.

To further illustrate, SamKnows deployed very few whiteboxes in the least populated states in the United States.32,33 A total of 4,378 whiteboxes were deployed, with only 16 Internet service providers volunteering for the project.34 Looking more in-depth at Nebraska, which has over 115 broadband providers, only 26 whiteboxes were deployed, with no notation of the number of service providers tested in this location.35 This did not provide a model for statistically sound geographic sampling, nor did this model identify broadband availability or quality.

Other projects have attempted to crowdsource the quality and availability of Internet service by using web portals such as Ookla. While helpful as a general indication, these crowdsourced mechanisms do not provide a geographic comparison beyond the county level, have no association with the actual speed the consumer should receive, and do not provide a sampling of speeds beyond one reading per location at one point in time.36

All current funding mechanisms for rural broadband are flawed in some major way: there is no established baseline for rural broadband, and current mapping efforts lack the sound statistical sampling needed to illustrate available consumer broadband. As such, the Rural Measures project seeks to establish a baseline and develop a statistically significant model for the Nation, in order to direct policy funding initiatives to the geographic locations with broadband gaps. This article provides a technical perspective intertwined with social science to provide transparency to anyone involved in policymaking. The following methodology illustrates the bond between social science and the technical complexities of measuring broadband gaps, a necessity for accurately measuring the rural-urban digital divide.

Methodology

The Rural Measures project has been through two pilot studies: (1) preliminary data collected from 65 participants and (2) more extensive data collected from 247 participants. Both pilot studies used a small collection device, called a Quantitative Throughput (QT) device, on the participant premises, coupled with an online survey, to provide a bottom-up, quantitative measuring approach. This approach was chosen to provide granular location throughput data matched with participant Internet plan information and participant perceptions concerning the Internet, yielding a rich data set that does not yet exist elsewhere. The goal in placing the QTs was to obtain a sample that could be generalized based on population density. The survey questions sought not only to provide a matched data set that addressed the question of broadband availability but also to provide context concerning the Internet and rural quality of life. The ultimate goal of the Rural Measures project was to provide a baseline for accurately measuring and promoting equitable rural broadband, as a foundation for a model ecosystem that could then grow vibrant communities.

Project Collaboration

For the success of both pilot projects, collaboration efforts centered on partnerships with Nebraska Public Power District (NPPD) and the Nebraska Rural Electrification Association (NREA). Nebraska is unique in that the electrical power grid is publicly owned; when coupled with research interests from the university, this creates a public–public partnership opportunity. Because of a previous relationship with cost analysis research,37 NPPD funded the cost of 300 QTs for both pilot studies. To provide oversight of the Rural Measures project, a small board of key members from NPPD and NREA began holding quarterly meetings with the researchers. Because of this direct relationship, and because the Rural Measures project has potential regulatory and legislative implications, both NPPD and NREA agreed to serve as distribution hubs for both outgoing and incoming QTs. NPPD and NREA also agreed to serve as a first-level help desk for participant questions regarding QT connections on the participant premise; questions were first directed to NPPD/NREA, with the researchers serving as a secondary help desk.

Participants

Participants were sampled throughout the state of Nebraska during both pilot studies. However, the first pilot study used a snowball sampling approach, whereas the second study used stratified sampling. In the first pilot study, key members from NPPD and NREA distributed the devices among themselves and then to other managers in their networks located across the state. In the second pilot study, the plan was to gain statistical power in each of the 49 legislative districts by class of city (Table 1). In examining these subpopulations, the state is densely urban in the east and very rural in the central and west portions. The second pilot study focused on four distinctly different legislative districts in terms of geographic population distribution.38

TABLE 1
Nebraska Classes of Cities by Population Density

Class of City^a    Population Size
Metropolitan       300,000 or more people
Primary Class      100,000–299,999
First Class        5,000–99,999
Second Class       800–4,999
Village            100–799

^a Revised table; data retrieved from Nebraska Department of Economic Development, https://opportunity.nebraska.gov/files/research/stathand/asect7.htm.

Rural Measures Protocol

The Rural Measures project operates with two distinct methods of data collection. First, the QT collects upload throughput, download throughput, and latency data directly from the participant's premises; second, Internet cost, Internet bandwidth plan, and Internet perception data are collected from the participants through an online survey. After the QT data collection is completed, the online survey data is matched with the QT data.

  1. QT software and hardware protocol. The QT device consisted of a Raspberry Pi with an add-on GPS39 module,40 which was connected to the open-source GPIO41 hardware connection on the Raspberry Pi. The Pi was packaged together with a how-to connection diagram42 and a participant consent form. Since the QT is intended to be directly wired to the participant's network, the participant had to have one open port (on either a router or a switch) and an open power outlet for the power cord in order to participate in the study. The QT then sat on the internal side of the participant's network. This allowed accurate end-device readings to be taken by the QT, which, as an independent device, cannot be influenced by other devices on the network. With this setup, the QT takes readings every five minutes, which allows for examination of participant temporal Internet patterns over 14 days (two weeks).

    To give the participant a visual indicator of status, the QT was equipped with two status lights on the top, indicating the overall health of the QT. In the first two pilot studies, a status of one red light and one green light indicated there was a problem with the GPS or Internet connectivity. Two green lights indicated that the QT was actively collecting data.

    In the early versions of the software, the QT data collection relied heavily on the human element of the distribution team. The researchers were optimistic that the distribution team would be able to regulate not only the placement of the device but also the timing of the data collection. In the first pilot study, the QT data collection relied solely on the GPS hat to pinpoint the participant location. During the latter part of the second pilot study, a spreadsheet was introduced as a logging mechanism for the distribution team to record location data as a secondary validator. In this design, the QT software consisted of open-source Python packages, which connected to the GPS hat and the Ookla®43 database for Internet speed test measurements, and one shell script44 to run the open-source packages (Figure 1); a minimal sketch of this measurement loop follows.
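The sketch below assumes the open-source speedtest-cli Python package as a stand-in for the Ookla client the QT used; read_gps() is a hypothetical placeholder for the GPS-hat driver. Note that the loop has no finite finishing point, mirroring the pilot-era flaw discussed under Results.

```python
# Minimal sketch of the pilot-era QT measurement loop. Assumes the
# open-source speedtest-cli package as a stand-in for the Ookla client;
# read_gps() is a hypothetical placeholder for the GPS-hat driver.
import time
import speedtest  # pip install speedtest-cli

def read_gps():
    """Hypothetical stand-in for reading (lat, lon) from the GPS hat."""
    raise NotImplementedError

def run_speed_test():
    st = speedtest.Speedtest()
    st.get_best_server()                 # Ookla selects a nearby test server
    download_mbps = st.download() / 1e6  # bits/s -> Mbps
    upload_mbps = st.upload() / 1e6
    return {"download_mbps": download_mbps,
            "upload_mbps": upload_mbps,
            "latency_ms": st.results.ping}

while True:                              # no finite end point, as in the pilots
    reading = run_speed_test()
    # reading["location"] = read_gps()   # pilot-era location source
    print(reading)
    time.sleep(300)                      # one reading every five minutes
```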

    FIGURE 1

    Software Flow of QT in First Two Pilot Studies with Heavy Human Elements.
  2. Online survey protocol. The online survey, administered through Qualtrics, consisted of 30–40 questions, depending on how the participant answered. Participants were prompted to complete the survey through the how-to connection diagram provided with the QT, and the distribution team also prompted the participant to take the survey during installation. The survey questions were split into three main sections, with a fourth optional section: (1) demographic information, (2) Internet cost and plan information, (3) Internet satisfaction and quality of life questions, and (4) dependent Internet satisfaction (optional). Descriptive analysis of the data showed normal distributions for age and sex.

Results

Data from these two pilot studies indicated differences in Internet connectivity between rural areas and classes of cities. Furthermore, the data trends began to show that the further one travels west in the state, away from densely populated areas or any city center, the fewer service providers are available. The available Internet speed options from providers decreased accordingly, and reported speeds trended in the same manner. Because of geographic accuracy quirks, no charts or analysis of the data are provided in this article. Instead, the researchers outline an analysis of the QT software and hardware as well as the online survey component. This is important because the validity and reliability of this model must be established before the project can serve as a model for the Nation.

QT Hardware and Software Analysis

During the first and second pilot studies, the GPS add-on module location data was found to be erratic. The most accurate GPS readings were obtained when the QT was placed in a window. In the majority of installs, the participant's Internet connection was on a lower floor, resulting in the QT being placed out of GPS range, and some installs placed the QT upside-down. This resulted in inaccuracies of anywhere from 0.25 to 1.5 miles from the actual participant location.

Since the QT's software did not have a finite finishing point for either its GPS or Internet speed testing, the number of onsite participant readings varied. Initially, the goal was a total of 14 days, to capture two weekends' worth of data. However, the distribution hubs' employees found it hard to schedule drop-off and pick-up times that precisely coordinated with the amount of data collection needed, so the researchers had to continually monitor the QT readings and send reports to the distribution hub indicating whether or not a unit was finished. This was neither efficient nor effective. Because of the faulty GPS readings, during the latter part of the second pilot the distribution team was asked to keep a log in a spreadsheet. This distribution protocol also proved ineffective: the spreadsheets kept by the distribution hubs were sometimes incomplete, and the reporting provided by the research team showed errors. After the close of the pilot studies, the hardware and software were both thoroughly examined; the resulting errors and explanations are provided in Table 2.

TABLE 2
Errors Noted in Distribution Protocol between Research and Distribution Teams

Error #1
  Research Team: QT is active and is not reporting a GPS location.
  Distribution: QT is active. The location is unknown.
  Explanation: Miscommunication of location data by distribution team.

Error #2
  Research Team: QT is not active.
  Distribution: QT is active and installed at the participant site.
  Explanation: Incorrect QT GPIO hardware wiring causes QT software crash.

Error #3
  Research Team: QT is active and is reporting a GPS location.
  Distribution: QT is active but GPS location is incorrect.
  Explanation: GPS module not recording properly because of QT premise location.

Reliability and Validity of Online Survey from Pilot Studies

To provide reliability and validity for the online survey, the survey was cocreated by the research team through an iterative process in collaboration with board members from both NREA and NPPD. This iterative process ensured that the wording of the questions was at an appropriate level and that the survey addressed the needed measures. After the initial pilot (n = 65), the research team and board members reconvened to make further small adjustments to the wording of the survey questionnaire. The online survey is geared to measure participants' perceptions and observations on three constructs: (1) attributes of the household Internet connection, (2) overall satisfaction with the Internet connection from the adult perspective, and (3) overall satisfaction with the Internet connection from household minors (18 years of age and younger). These elements were chosen based on Salemink et al.'s research and feedback received from the external advisory board created with NPPD and NREA.

To validate the constructs, a principal component analysis (PCA) was performed on the second pilot study results (n = 247). Prior to performing the PCA, fitness tests were done. Upon inspection, the correlation matrix showed that all variables had at least one correlation coefficient greater than 0.3. The overall Kaiser–Meyer–Olkin (KMO) measure was 0.825, and all individual measures were at 0.7 or above, which is acceptable. The exception was the household minors' satisfaction group of questions; this was expected, as there were fewer cases for the household minors' section. Bartlett's test of sphericity showed statistical significance (p < 0.0005). All measures indicated that the data was suitable for factoring; a sketch of these fitness checks follows.
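A minimal sketch of these pre-PCA fitness checks, assuming the numeric survey responses sit in a pandas DataFrame named survey_df (a hypothetical name) and using the open-source factor_analyzer package:

```python
# Sketch of the pre-PCA fitness checks described above; survey_df is a
# hypothetical pandas DataFrame of numeric survey responses.
import pandas as pd
from factor_analyzer.factor_analyzer import (calculate_bartlett_sphericity,
                                             calculate_kmo)

def check_factorability(survey_df: pd.DataFrame) -> None:
    # Each variable should correlate above 0.3 with at least one other item.
    corr = survey_df.corr().abs()
    for col in corr.columns:
        print(f"{col}: max |r| with another item = "
              f"{corr[col].drop(col).max():.2f}")

    # Bartlett's test of sphericity; a significant p supports factoring.
    chi_sq, p_value = calculate_bartlett_sphericity(survey_df)
    print(f"Bartlett: chi-square = {chi_sq:.1f}, p = {p_value:.5f}")

    # Kaiser-Meyer-Olkin sampling adequacy, per item and overall.
    kmo_per_item, kmo_overall = calculate_kmo(survey_df)
    print(f"Overall KMO = {kmo_overall:.3f}")
```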

The PCA extracted six components with eigenvalues greater than one, explaining 35.67%, 13.17%, 7.34%, 6.76%, 5.80%, and 5.65% of the total variance, respectively. However, visual inspection of the scree plot indicated that three components should be retained before the inflection point. Although retaining six components led to a total explained variance of 74.38% versus 56.18%, three components better met the interpretability criterion. Thus, three components were retained.

This three-component solution (Table 3) explained 56.18% of the variance. A Varimax orthogonal rotation was used to aid interpretability, and the rotation illustrated simple structure. The final interpretation of the data aligned with three constructs: (1) Satisfaction with Internet, (2) Internet Cost Attributes, and (3) Internet Speeds. The first component, Satisfaction with Internet, encompassed all satisfaction indices (participants' and dependents' satisfaction with rural Internet and their own connectivity). The second component, Internet Cost Attributes, contained most Internet attributes (cost, user perception of needed speed, type of Internet medium, etc.). The third component aligned with Internet Speed attributes (user-reported upload and download speed), with a weak loading from Rural-Pays-More. A sketch of this extraction and rotation follows.
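A minimal sketch of the extraction and rotation step, again assuming the Table 3 items sit in the hypothetical survey_df DataFrame and using factor_analyzer's principal-component method:

```python
# Sketch of the component extraction and Varimax rotation; survey_df is the
# same hypothetical DataFrame of survey items used in the fitness checks.
import pandas as pd
from factor_analyzer import FactorAnalyzer

def rotated_solution(survey_df: pd.DataFrame,
                     n_components: int = 3) -> pd.DataFrame:
    fa = FactorAnalyzer(n_factors=n_components, method="principal",
                        rotation="varimax")
    fa.fit(survey_df)
    loadings = pd.DataFrame(
        fa.loadings_, index=survey_df.columns,
        columns=[f"Component {i + 1}" for i in range(n_components)])
    # Communalities show how much of each item the solution reproduces.
    loadings["Communalities"] = fa.get_communalities()
    return loadings
```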

TABLE 3
Varimax Rotated Structure Matrix, Three Components with PCA

                             Rotated Component Coefficients
                             1        2         3        Communalities
Satisfaction-Stable          0.659   −0.235     0.074    0.495
Satisfaction-Speed           0.798   −0.161     0.080    0.670
Satisfaction-Cost            0.390   −0.509^a   0.092    0.419
Satisfaction-Variety         0.648   −0.163     0.256    0.512
Satisfaction-Needs           0.768   −0.173     0.113    0.632
Satisfaction-Stable-Dep      0.915    0.022     0.051    0.840
Satisfaction-Speed-Dep       0.911   −0.072     0.031    0.835
Satisfaction-Overall-Dep     0.918   −0.040     0.109    0.856
Satisfaction-Needs-Dep       0.908   −0.006     0.089    0.833
Satisfaction-Rural-vs-Urban  0.579   −0.082     0.044    0.344
Internet-Medium              0.096   −0.467     0.210    0.271
Monthly-Internet-Cost       −0.037    0.470    −0.076    0.228
Download-Preferred-Speed    −0.036    0.730     0.352    0.658
Upload-Preferred-Speed      −0.060    0.776     0.335    0.718
Reported-Upload-Speed        0.057    0.066     0.891    0.802
Reported-Download-Speed      0.143    0.071     0.824    0.704
Rural-Pays-More              0.260   −0.198     0.317    0.208

^a Denotes secondary complex loading.

The two outliers in this model were the Rural-Pays-More and Satisfaction-Cost questions. The Rural-Pays-More question measures, on a Likert scale, how much more participants think they should pay for Internet in rural areas; based on this analysis, the researchers have reworded this question for future distribution. The Satisfaction-Cost question was an outlier in that it had a complex loading. However, because this question measured satisfaction with cost, the heavier negative loading on the Internet Cost Attributes component (−0.509) with a secondary loading on satisfaction (0.390) aligns with the data: participants' satisfaction with cost had a negative relationship with expected speeds and measured speeds.

Discussion and Future of the Rural Measures Protocols

To create a valid and reliable model, the data from the pilot studies was used to create a more reliable software flow and a more robust hardware model, allowing validated data collection to occur. Within the online survey component, questions were readjusted in coordination with the previous analysis (Table 3).

The other main outcome was a set of recommendations for accurate testing (Table 4). The hardware and software protocols and the online survey protocol described in the following sections were designed specifically from lessons learned in the pilots. Table 4 reflects these recommendations and best practices for accurate testing in a completely nontechnical table intended for use in policymaking decisions.

TABLE 4
Recommendations to Create a Robust Solution for Measuring the Urban–Rural Digital Divide

1. An independent device must measure at the participant premise.^a
2. The independent device must be from an independent source (not owned or controlled by an Internet service provider).
3. The independent device must be installed with sequential speed tests running without notifying or alerting the service provider.
4. The independent device must be connected via a network cable (to avoid latency in the participant's wireless connectivity).
5. The independent device must be connected to a source closest to the service provider's connection.
6. The independent device must test sequentially over time to include both peak and off-peak hours.
7. The sequential speed (throughput) tests must reach a level of statistical significance at each customer premise: at least 650 tests.
8. The sequential speed tests must record over multiple days, preferably over the weekend and several weekdays. Recommendation is 7–10 days.
9. The sequential speed tests must minimally run a test every 10 minutes.
10. The sequential speed tests must be allowed to randomly seek out various server locations.
11. The sequential speed tests must be verified against the purchased Internet plan of the participant.
12. The sequential speed tests must be tied specifically to the geographic location of the test; a physical street address is highly recommended.
13. Optional. The sequential speed tests can be tied to participant indices such as reliability, availability, speed, latency, media type, and others for a rich data set that can help community vitality and digital inclusion efforts.

^a Special concern must be taken with metered Internet connections. The sequential speed tests will consume the metered Internet connection's data plan in a short time period.

  1. Future QT hardware and software protocol. To provide a more sustainable model, the QT's software was reprogrammed (Figures 2 and 3) to allow for a hands-off approach with no distribution team needed. The improved QT package is shipped directly to the participant in a small box that contains a validated how-to connection diagram45 and a consent form.46 The QT hardware was thoroughly inspected, and each QT was rewired to consistent standards to create hardware reliability. Two status lights remain on the top of the QT, but an additional color has been added for a better participant experience (Table 5).

    FIGURE 2

    Overview Workflow of QT with Activation Feature.

    FIGURE 3

    Detailed QT Software Flow with Reduced Human Elements.

    TABLE 5

    Notes: QT consists of a Raspberry Pi, GPS hat, and status lights (attribution to Will Jones, UNK undergraduate student, for soldering the status lights to the GPS hat and creating the first working program with the status lights).

    The updated QT software runs on a hardened47 Raspbian Lite operating system. The programs needed to create the QT's software flow (Figures 2 and 3) utilize customized Python 3 software modules and system task schedulers.

    After receiving the QT in the mail and reviewing the how-to connection diagram, the participant is directed to activate the QT using a website (Figure 4). During both pilot studies, errors were noted in the QT data collection process (Table 2); the activation process was created specifically to address Errors #1 and #3. To ensure reliability, the QT id, state, and zip code are validated against a finite list of data entries. After entering the proper information, the QT is activated and can begin data collection (Figure 2). After properly submitting the QT online activation information, the participant is redirected to the online survey. Upon completion of the online survey, the participant is able to view, in real time, their upload and download Internet throughput (Figure 5). A sketch of the activation check follows.
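A minimal sketch of that server-side validation; the allow-list entry (id, state, and zip values) is hypothetical:

```python
# Sketch of the activation check: the submitted QT id, state, and zip code
# are validated against a finite allow-list of issued devices. The entry
# below is hypothetical.
VALID_UNITS = {
    "QT-0042": {"state": "NE", "zip": "68849"},
}

def activate(qt_id: str, state: str, zip_code: str) -> bool:
    """Return True only when all three submitted fields match the list."""
    expected = VALID_UNITS.get(qt_id)
    if expected is None:
        return False  # unknown device id
    return (expected["state"] == state.strip().upper() and
            expected["zip"] == zip_code.strip())
```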

    FIGURE 4

    Rural Measures Activator Web Page. Found at https://ruralmeasures.com/activator. Attribution to Adam Spanier, undergraduate student at UNK, for its creation.
    FIGURE 5

    Example Upload and Download Speed Chart That the User Can View after Taking Online Survey.

    The QT runs a speed test utilizing the Ookla® service at five-minute intervals for a seven-day period (reduced from 14), resulting in 1,863 total readings at one location. The QT self-regulates the testing period and prompts the user with its blue status lights when finished. The variables collected during these speed tests are upload and download throughputs, latency, and the service provider company name.

    The QT measures only the throughput available to itself while it is active on the network. This is a constraint of any device placed on a participant's network to measure Internet attributes, even at the service provider device level. However, due to the frequency of speed tests, the QT is able to accurately measure network throughput during very active network times (7:00 pm–10:00 pm) and nonbusy times (2:00 am–5:00 am); sometimes during a weekday, the network will also display higher throughput from 10:00 am to 4:00 pm, while participants are away from home.48 Google Analytics reports that worldwide daily web use collapses to a trickle at 4:00 am. Other studies have found similar results, with peak hours varying depending on the usage environment, but both report that the least traffic flows during the very early hours of the morning, around 3:00 or 4:00 am.49

    Because this type of speed test consumes all available throughput during the test, participants with metered Internet were given a specialized QT that collects at 120-minute intervals over a longer data collection period of 14 days. For example, suppose a participant has a daily 2 gigabyte (GB) data cap and a 5 Mbps speed package from their service provider for both upload and download. Each speed test may take approximately 15 seconds, and during the resulting 30 seconds (two tests, one each for upload and download) the test will attempt to max out the throughput. This means that one speed test pair could potentially consume roughly 150 megabits (about 19 MB) of data; at the regular five-minute cadence, this would effectively max out a 2 GB daily data cap within a 24-hour period. As such, these QT tests on metered connections are not as robust, as the measurements are quite sparse: after two weeks, the metered connection would yield 168 separate tests versus 1,863 on the regular nonmetered test. The worked check below illustrates the arithmetic.
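A quick worked check of that arithmetic, using the illustrative plan values from the example above:

```python
# Worked check of the metered-connection arithmetic, using the illustrative
# example plan above (5 Mbps symmetric, 2 GB daily cap).
PLAN_MBPS = 5.0        # plan speed, both upload and download
TEST_SECONDS = 30.0    # one upload test plus one download test
DAILY_CAP_GB = 2.0

bits_per_test = PLAN_MBPS * 1e6 * TEST_SECONDS   # ~150 megabits per test pair
gb_per_test = bits_per_test / 8 / 1e9            # ~0.019 GB per test pair

for label, interval_min in [("regular QT (5-min)", 5),
                            ("metered QT (120-min)", 120)]:
    tests_per_day = 24 * 60 / interval_min
    print(f"{label}: {tests_per_day * gb_per_test:.2f} GB/day "
          f"vs {DAILY_CAP_GB} GB cap")
# regular QT (5-min): 5.40 GB/day vs 2.0 GB cap   -> cap exhausted
# metered QT (120-min): 0.23 GB/day vs 2.0 GB cap -> well under the cap
```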

    After more than 100 iterative debugging tests, the new version of the QT (version 2.0) has proven extremely robust. Through the newly developed software, the hardened operating system, and the activation features, all of the previously noted protocol errors have been resolved.

  2. Online survey protocol. Since the QT collects measures directly from the participant's site and the online survey collects the participant's perceptions about their Internet speed and connection, an incredibly rich data set can be created by merging the QT data with the online survey data. The data between the QT and online survey elements (Table 6) is easily matched because of the QT id, time/date, and location stamp obtained through the activation site, and the QT id and location data obtained through the online survey; the two pieces match together for a rich, granular data set (a sketch of this matching follows). This type of data set is not available anywhere else and can provide insight into further accurately measuring, and then visualizing, the rural and urban digital divide. As noted previously in Table 3, the survey provided valid results, and thus future measures will be obtained with confidence using this survey instrument.
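A minimal sketch of the matching step, assuming hypothetical pandas DataFrames qt_df (one row per speed test) and survey_df (one row per participant), each carrying a qt_id column:

```python
# Sketch of matching QT readings to survey responses on the shared QT id.
# qt_df and survey_df are hypothetical DataFrame names.
import pandas as pd

def match_datasets(qt_df: pd.DataFrame,
                   survey_df: pd.DataFrame) -> pd.DataFrame:
    """Attach each participant's survey record to every one of their readings."""
    return qt_df.merge(survey_df, on="qt_id", how="inner",
                       validate="many_to_one")  # many readings, one survey row
```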

TABLE 6
Survey Measures and Description

Demographics: Participant records age, gender, marital status, household income, urban vs rural status, and distance from city center.
Internet speed: Participant records upload and download speeds from the Internet bill and preferred speeds.
Internet cost: Participant records total costs of the Internet connection only and costs of bundling, if present.
Internet medium: Participant records what type of Internet medium is used.
Internet satisfaction of adult household members: Participant records different Internet satisfaction measures, including connection stability, cost, variety, speed, and overall satisfaction.
Quality of life indices: Participant records quality of life indices such as the ability to do remote education, healthcare, and entrepreneurship initiatives; also records the ability to game, video conference, and perform secure transactions.

Measures from QT combined with survey data can provide:

  1. The reported throughput speeds (upload and download) according to Internet medium type (DSL, satellite, cable, terrestrial wireless, fiber, and mobile).

  2. The cost of the Internet connection by reported speed package versus actual throughput (including both bundled and nonbundled costs).

  3. The participant's reported speed package versus the highest Internet speed offering available.

  4. The participant's reported speed package versus the actual throughput the participant receives.

  5. The number of participants reported to be on the highest speed package versus the recorded throughput.

  6. The participant's satisfaction with stability, choice, speed, and affordability versus the Internet medium.

Discussion and Future Pilot Project

In future pilot project(s), the researchers will shift the project collaboration efforts due to interest from other groups, both inside and outside the state of Nebraska, in order to keep proofing this model for nationwide deployment. The recommendations model (Table 4) continues to gain interest through distribution to both policymakers and nonprofit groups. The researchers hope to continue to provide a bridge between such groups and the technical work needed to measure the urban-rural digital divide in a scientific manner. The participant pool will also shift slightly based on a better sampling methodology. As a result of the shift in focus of participants and collaborations, the Rural Measures protocols were redesigned; this redesign effort not only addresses the shift in focus but also solves the major errors previously noted in the QT hardware and software protocols.

Project Collaboration

In the future, the Nebraska Extension groups, alongside other state groups, will be part of this collaborative effort. Community leaders have also expressed interest in participating based on community grant planning initiatives. Additional collaborations are underway with other states, service providers, communities, economic development groups, and other governmental agencies.

Participants

In the future, Geographic Information Science (GIScience) will be used to produce a stratified sampling methodology within the Nebraska classes of cities and across metropolitan, micropolitan, and rural areas to note where differences are occurring.50 Stratifying in this manner will allow for a robust sampling methodology that can extend outside of the state's boundaries and be used as a model for the Nation. This approach will also create a data set that can be easily analyzed using GIScience. The distribution and recruiting of participants are currently in progress, and a best model for doing so will be incorporated into future research. Experiments with localized support agents and incentives are being explored. A sketch of such a stratified draw follows.
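A minimal sketch of such a stratified draw across the Table 1 classes of cities, assuming a hypothetical households_df DataFrame with a class_of_city column; the per-stratum sample sizes are illustrative only:

```python
# Sketch of a stratified sample across the Nebraska classes of cities
# (Table 1). households_df and the per-stratum sizes are illustrative.
import pandas as pd

SAMPLES_PER_STRATUM = {"Metropolitan": 50, "Primary Class": 50,
                       "First Class": 50, "Second Class": 50, "Village": 50}

def stratified_sample(households_df: pd.DataFrame,
                      seed: int = 1) -> pd.DataFrame:
    parts = []
    for stratum, n in SAMPLES_PER_STRATUM.items():
        pool = households_df[households_df["class_of_city"] == stratum]
        # Draw up to n households from each stratum, reproducibly.
        parts.append(pool.sample(n=min(n, len(pool)), random_state=seed))
    return pd.concat(parts, ignore_index=True)
```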

Conclusion

The Rural Measures project provides a unique model that has now been vetted through two different pilot studies, with further studies planned to validate the new survey and QT hardware/software processes. After these pilots are complete, the model has the potential to be used as the accurate measurement strategy for broadband nationwide.

This methodology provides granular data from a bottom-up participant perspective at residences and small businesses. The resulting data set is robust and provides a way to truly visualize the rural-urban digital divide. It offers a solid foundation for planning community vitality initiatives, community digital readiness, and community grant planning and, more importantly, for state, regional, and national policymakers. The visualization enabled by the Rural Measures research project enhances current broadband mapping initiatives, making this project important, as continued examination of rural broadband is deeply needed, particularly as broadband-intensive applications continue to make their way to rural areas (e.g., precision agriculture equipment, drone agricultural pesticide spraying and mapping).

To continue to present Rural Measures as a model that enhances and takes broadband mapping initiatives to the next level, paving the way for digital inclusion and vital community initiatives, this article provides the lessons-learned recommendations from the two pilot studies (with a third in the planning process) in Table 4. These recommendations should serve as the basis for other projects that aim to enhance the visualization of the rural-urban digital divide through better broadband mapping, and for policymakers who would like to incorporate a model for accurate testing of Internet service quality and availability.

FOOTNOTES

1. Velaga et al.
2. Townsend et al.; Obermier; Obermier and Hollman.
3. Salemink, Strijker, and Bosworth.
4. Hilbert, “The Bad News,” “Technological Information.”
5. Grubesic and Murray, “Geographies of Imperfection.”
6. Obermier; Obermier and Hollman.
7. Anderson, Wallace, and Townsend.
8. Whitacre.
9. Lewis and Severnini.
10. Mueller.
11. Communications Act of 1934, 47 U.S.C. §151. Purposes of chapter.
12. Hauge, Chiang, and Jamison.
13. Zolnierek and Clausen.
15. First Report and Order, 29 FCC Rcd 15644 (2014).
16. Third Report and Order, Further Report and Order, and Order on Reconsideration, 31 FCC Rcd 3962 (2016).
17. Salemink, Strijker, and Bosworth, 367.
18. Whitacre, Gallardo, and Strover; Mahasuweerachai, Whitacre, and Shideler.
19. Belinfante, “Telephone Subscribership.”
20. Ibid.
21. Lin and Liou.
22. Prieger and Hu.
23. Coomans et al.
24. Nandi et al.
25. McNally et al.
26. Mack et al.; Grubesic and Murray, “Waiting for Broadband.”
27. Mack et al.
28. Thomas et al.; Philip et al.; Nishijima, Ivanauskas, and Sarti; Correa and Pavez; Pant and Odame.
29. Riddlesden and Singleton.
30. FCC, Tenth Measuring Broadband America Fixed Broadband Report.
31. FCC, Measuring Fixed Broadband – Eighth Report.
33. Exact numbers of whiteboxes in these least populated states were: 2 whiteboxes in Wyoming, 2 in Vermont, 0 in North Dakota, 34 in Alaska, 2 in South Dakota, 16 in Delaware, 9 in Montana, 12 in Rhode Island, 13 in New Hampshire, and 7 in Maine, for a total of 97 whiteboxes.
36. Bischof et al.
37. Obermier; Obermier and Hollman.
39. Global Positioning System.
40. Model: Adafruit MTK3329.
41. General purpose input-output.
42. This diagram gave specific instructions on how to connect a power and network cord to the participant premise.
43. The Ookla database is a crowdsourcing tool used to test Internet connections (https://speedtest.net).
44. A computer program installed in the base Raspberry Pi operating system to run items at a specific time.
45. Validated with an iterative approach with a small pilot focus group.
46. During the COVID-19 pandemic, an informational sheet about sanitization of the QT is also included.
47. Hardening is defined in this project as using only the bare minimum of installed programs in order to perform the measurements needed specific to the QT.
48. This proves especially true in 2020 with many people working from home during the COVID-19 pandemic.
49. Cisco, Cisco Annual Internet Report (2018–2023) White Paper; Kihl et al.
50. Combs et al.

BIBLIOGRAPHY

Anderson, Alistair R., Claire Wallace, and Leanne Townsend. “Great Expectations or Small Country Living? Enabling Small Rural Creative Businesses with ICT.” Sociologia Ruralis 56, no. 3 (2016): 450–68.

Belinfante, Alexander. “Telephone Subscribership in the United States.” Federal Communications Commission, Industry Analysis Division, 2001. https://transition.fcc.gov/Bureaus/Common_Carrier/Reports/FCC-State_Link/IAD/subs1100.pdf.

Belinfante, Alexander. “Telephone Subscribership in the United States.” Federal Communications Commission, Common Carrier Bureau, 2001. https://transition.fcc.gov/Bureaus/Common_Carrier/Reports/FCC-State_Link/IAD/subs1104.pdf.

Bischof, Zachary S., John S. Otto, Mario A. Sánchez, John P. Rula, David R. Choffnes, and Fabián E. Bustamante. “Crowdsourcing ISP Characterization to the Network Edge.” In Proceedings of the First ACM SIGCOMM Workshop on Measurements Up the Stack, Toronto, Ontario, Canada, 2011, 61–66. New York, NY: ACM.

Cisco. Cisco Annual Internet Report (2018–2023) White Paper. San Jose, CA: Cisco, Inc., 2020. Accessed January 1, 2021. https://www.cisco.com/c/en/us/solutions/collateral/executive-perspectives/annual-internet-report/white-paper-c11-741490.html.

Combs, H. Jason, Paul Burger, Christina Sogar, Julie Campbell, Timbre Wulf, and Jody Van Laningham. “Employing GIScience to Address the Perceived Needs and Service Use among Youth Offenders Preparing for Reentry to Rural and Urban Communities.” Papers in Applied Geography 5, no. 1–2 (2019): 119–25.

Coomans, Werner, Rodrigo B. Moraes, Koen Hooghe, Alex Duque, Joe Galaro, Michael Timmers, Adriaan J. van Wijngaarden, Mamoun Guenach, and Jochen Maes. “XG-fast: The 5th Generation Broadband.” IEEE Communications Magazine 53, no. 12 (2015): 83–88.

Correa, Teresa, and Isabel Pavez. “Digital Inclusion in Rural Areas: A Qualitative Exploration of Challenges Faced by People from Isolated Communities.” Journal of Computer-Mediated Communication 21, no. 3 (2016): 247–63.

FCC, Office of Engineering and Technology. Tenth Measuring Broadband America Fixed Broadband Report. Measuring Broadband America/10. Washington, DC: FCC (OET), 2021.

Grubesic, Tony H. “The US National Broadband Map: Data Limitations and Implications.” Telecommunications Policy 36, no. 2 (2012): 113–26.

Grubesic, Tony H., and Alan T. Murray. “Waiting for Broadband: Local Competition and the Spatial Distribution of Advanced Telecommunication Services in the United States.” Growth and Change 35, no. 2 (2004): 139–65.

Grubesic, Tony H., and Alan T. Murray. “Geographies of Imperfection in Telecommunication Analysis.” Telecommunications Policy 29, no. 1 (2005): 69–94.

Hauge, Janice A., Eric P. Chiang, and Mark A. Jamison. “Whose Call is it? Targeting Universal Service Programs to Low-income Households' Telecommunications Preferences.” Telecommunications Policy 33, no. 3–4 (2009): 129–45.

Hilbert, Martin. “Technological Information Inequality as an Incessantly Moving Target: The Redistribution of Information and Communication Capacities between 1986 and 2010.” Journal of the Association for Information Science and Technology 65, no. 4 (2014): 821–35.

Hilbert, Martin. “The Bad News is that the Digital Access Divide is here to Stay: Domestically Installed Bandwidths among 172 Countries for 1986–2014.” Telecommunications Policy 40, no. 6 (2016): 567–81.

Hite, James. “The Thunen Model and the New Economic Geography as a Paradigm for Rural Development Policy.” Review of Agricultural Economics 19, no. 2 (1997): 230–40.

Hollifield, C. Ann, Joseph F. Donnermeyer, Gwen H. Wolford, and Robert Agunga. “The Effects of Rural Telecommunications Self-development Projects on Local Adoption of New Technologies.” Telecommunications Policy 24, no. 8–9 (2000): 761–79.

Kihl, Maria, Per Ödling, Christina Lagerstedt, and Andreas Aurelius. “Traffic Analysis and Characterization of Internet User Behavior.” In International Congress on Ultra Modern Telecommunications and Control Systems, Moscow, Russia, October 18–20, 2010. https://doi.org/10.1109/ICUMT.2010.5676633.

Lennett, B., and S. Meinrath. “Map to Nowhere.” Slate, 2011.

Lewis, Joshua, and Edson Severnini. “Short- and Long-run Impacts of Rural Electrification: Evidence from the Historical Rollout of the US Power Grid.” Journal of Development Economics 143 (2020): 102412.

Lin, David W., and M.-L. Liou. “A Tutorial on Digital Subscriber Line Transceiver for ISDN.” In 1988 IEEE International Symposium on Circuits and Systems, Espoo, Finland, June 7–9, 1988. https://doi.org/10.1109/ISCAS.1988.15055.

Lyons, Daniel. “Reforming the Universal Service Fund for the Digital Age.” In Communications Law and Policy in the Digital Age, edited by Alfred C. Yen, 123–35. Durham, NC: Carolina Academic Press, 2012.

Lyons, Daniel. “Narrowing the Digital Divide: A Better Broadband Universal Service Program.” UC Davis Law Review 52 (2018): 803.

Mack, Elizabeth A., William H. Dutton, R. V. Rikard, and Aleksandr Yankelevich. “Mapping and Measuring the Information Society: A Social Science Perspective on the Opportunities, Problems, and Prospects of Broadband Internet Data in the United States.” The Information Society 35, no. 2 (2019): 57–68.

Mahasuweerachai, Phumsith, Brian E. Whitacre, and David W. Shideler. Does Broadband Access Impact Population Growth in Rural America? No. 319-2016-9723. 2009.

McNally, Michael B., Dinesh Rathi, Kris Joseph, and Amy Adkisson. “Ongoing Policy, Regulatory, and Competitive Challenges Facing Canada's Small Internet Service Providers.” Journal of Information Policy 8 (2018): 167–98.

Mueller, Milton. Universal Service: Competition, Interconnection, and Monopoly in the Making of the American Telephone System. Cambridge, MA: American Enterprise Institute/MIT Press, 1997.

Nandi, Somen, Saigopal Thota, Avishek Nag, Sw Divyasukhananda, Partha Goswami, Ashwin Aravindakshan, and Biswanath Mukherjee. “Computing for Rural Empowerment: Enabled by Last-mile Telecommunications.” IEEE Communications Magazine 54, no. 6 (2016): 102–9.

Nishijima, Marislei, Terry Macedo Ivanauskas, and Flavia Mori Sarti. “Evolution and Determinants of Digital Divide in Brazil (2005–2013).” Telecommunications Policy 41, no. 1 (2017): 12–24.

Obermier, Timothy R. “Residential Internet Access Cost in Nebraska.” Great Plains Research 28, no. 2 (2018): 149–54.

Obermier, Timothy R., and Angela K. Hollman. “Economic Impact of National Exchange Carrier Association Tariffs on Internet Access Cost in Rural Areas.” Mountain Plains Journal of Business and Technology 21, no. 2 (2020): 1–16.

Pant, Laxmi Prasad, and Helen Hambly Odame. “Broadband for a Sustainable Digital Future of Rural Communities: A Reflexive Interactive Assessment.” Journal of Rural Studies 54 (2017): 435–50.

Philip, Lorna, Caitlin Cottrill, John Farrington, and Fiona Ashmore. “The Digital Divide: Patterns, Policy and Scenarios for Connecting the ‘Final Few’ in Rural Communities across Great Britain.” Journal of Rural Studies 54 (2017): 386–98.

Prieger, James E., and Wei-Min Hu. “The Broadband Digital Divide and the Nexus of Race, Competition, and Quality.” Information Economics and Policy 20, no. 2 (2008): 150–67.

Riddlesden, Dean, and Alex D. Singleton. “Broadband Speed Equity: A New Digital Divide?” Applied Geography 52 (2014): 25–33.

Salemink, Koen, Dirk Strijker, and Gary Bosworth. “Rural Development in the Digital Age: A Systematic Literature Review on Unequal ICT Availability, Adoption, and Use in Rural Areas.” Journal of Rural Studies 54 (2017): 360–71.

Schneir, Juan Rendon, and Yupeng Xiong. “A Cost Study of Fixed Broadband Access Networks for Rural Areas.” Telecommunications Policy 40, no. 8 (2016): 755–73.

Silva, Simone, Narine Badasyan, and Michael Busby. “Diversity and Digital Divide: Using the National Broadband Map to Identify the Non-adopters of Broadband.” Telecommunications Policy 42, no. 5 (2018): 361–73.

Thierer, Adam D. “Unnatural Monopoly: Critical Moments in the Development of the Bell System Monopoly.” Cato Journal 14 (1994): 267.

Thomas, Julian, Jo Barraket, Scott Ewing, Trent MacDonald, and Julie Tucker. “Measuring Australia's Digital Divide: The Australian Digital Inclusion Index 2016.” Melbourne, Australia: Swinburne University of Technology, Centre for Social Impact, 2016. https://doi.org/10.4225/50/57A7D17127384.

Townsend, Leanne, Arjuna Sathiaseelan, Gorry Fairhurst, and Claire Wallace. “Enhanced Broadband Access as a Solution to the Social and Economic Problems of the Rural Digital Divide.” Local Economy 28, no. 6 (2013): 580–95.

Velaga, Nagendra R., Mark Beecroft, John D. Nelson, David Corsar, and Peter Edwards. “Transport Poverty Meets the Digital Divide: Accessibility and Connectivity in Rural Communities.” Journal of Transport Geography 21 (2012): 102–12.

Whitacre, Brian E. “The Diffusion of Internet Technologies to Rural Communities: A Portrait of Broadband Supply and Demand.” American Behavioral Scientist 53, no. 9 (2010): 1283–303.

Whitacre, Brian E., and Bradford F. Mills. “Infrastructure and the Rural–Urban Divide in High-speed Residential Internet Access.” International Regional Science Review 30, no. 3 (2007): 249–73.

Whitacre, Brian, Roberto Gallardo, and Sharon Strover. “Does Rural Broadband Impact Jobs and Income? Evidence from Spatial and First-Differenced Regressions.” The Annals of Regional Science 53, no. 3 (2014): 649–70.

Zolnierek, James, and Torsten Clausen. “Local Telephone Rate Structure and Telephone Penetration: A Universal Service Perspective.” Information Economics and Policy 22, no. 2 (2010): 153–63.
This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.