
Social life-cycle assessment (S-LCA) of residential rooftop solar panels using challenge-derived framework

Abstract

Background

Social life-cycle assessment (S-LCA) provides a framework to evaluate the social impacts of decisions made during the design phases of a product. Rooftop solar panels are considered an environmentally friendly renewable energy technology because they generate electricity without producing greenhouse gases during use. This study presents the application of a challenge-derived S-LCA framework to assess the social impacts of rooftop solar panels in the southeast region of the United States (U.S.) during the use and end-of-life phases.

Methods

The challenge-derived S-LCA framework was developed based on a set of challenges to performing social assessments. The challenges were identified through a systematic mapping process and verified using expert feedback. Additional user feedback was gathered from mechanical engineering capstone design students. The case study application shown in this paper aims to identify the potential social impacts of rooftop solar panels in residential applications at a pre-implementation stage. The framework follows the ISO 14040 LCA structure, and the analysis was performed based on impact indicators (Type-I framework) and performance reference points (PRPs).

Results

The framework implements existing social impact assessment methodologies and guides each assessment stage based on the type of analysis performed. The results identify workers as the stakeholder group with the highest social impacts. They also highlight the need for regulation to make rooftop solar panels accessible to low-income community members.

Conclusions

An S-LCA framework to assess the social impacts of product systems and technologies is implemented to evaluate the potential social impacts of residential rooftop solar panels. The framework presented applies to product systems and technologies at a pre- or post-implementation state, and it aims to guide novice and expert users alike. Nonetheless, further research is still needed to improve the methodology presented, and additional case studies should be performed to test the applicability of the framework across a broad set of fields.

Background

Social life-cycle assessment (S-LCA) is defined as a technique to assess the social and socio-economic aspects of products and their potential positive and negative impacts along their life cycle on different stakeholders [1]. Relative to the economic and environmental dimensions of life-cycle assessment, the social dimension lacks focus and consolidation [2]. Over the last decade, the social dimension has seen a surge in interest from researchers [3, 4], which is attributed to a multitude of factors. One factor is the publication of the 2009 Guidelines for the Social Life-Cycle Assessment of Products from the United Nations Environmental Program (UNEP) and the Society of Environmental Toxicology and Chemistry (SETAC) [1]. The guidelines adopt the environmental life-cycle assessment ISO 14044 structure [5] and extend it to perform a social assessment of a product system based on impact indicators. Their publication provided a significant contribution to the field that has prompted the development of frameworks, handbooks, and case studies [6, 7]. The 2009 UNEP/SETAC S-LCA served as a significant starting point in the development of the framework presented in this paper. More importantly, the 2011 UNEP/SETAC Methodological Sheets [8] not only serve as a source of indicators gathered in Additional file 1, but they also provide the stakeholder categories adopted in our analysis. The widespread adoption of the UNEP/SETAC methodology was key in the decision to maintain the overall S-LCA structure in the framework presented in this paper.

With individuals and communities as the focus of social impact assessments, S-LCA has been implemented to better understand the social impacts of renewable energy sources. Mainstream energy-generating methods, such as nuclear and fossil-fuel technologies, have shown disproportionate social impacts on lower-income populations [9, 10]. Such impacts usually result from the production of greenhouse gases, water and natural resource pollution, and land expropriation [11]. Renewable energy technologies are perceived as environmentally friendly due to their ability to generate electricity without producing emissions during their use phase [12]. Unfortunately, so-called green technologies also produce harmful emissions when one considers their complete life cycle. Taking solar photovoltaic panels as an example, their production involves the extraction and processing of rare-earth materials that are known to have a significant environmental impact [13]. Adopting a life-cycle analytical approach is therefore essential for a holistic understanding of the social impacts of renewable electricity sources.

Table 1 provides a summary of previous studies that have investigated the social impacts of different renewable energy sources. The tabulated information yields several observations that illustrate why social assessments are challenging to perform. The first observation is the wide variation in the characteristics of the renewable energy systems evaluated. The studies assess systems such as a bioenergy generation plant [14], wind turbines [15], hydrogen production [16], aviation biofuels [17], solar thermal power plants [10, 18], PV modules [19,20,21], and PV power plants [22]. The second observation is the diverse geographical locations of the systems analyzed: North America, South America, the European Union, and Asia. Different geographical locations introduce variability in the goal and scope of the analysis because they also differ in political, environmental, regulatory, and economic characteristics, all of which may affect the results of the social assessments. The third observation is the variability of the methodologies used to perform the social assessments. Although previous studies have shown the dominance of the S-LCA framework when performing social assessment studies [26], additional methods such as the Cumulative Social Effects Framework [14], the Framework for Integrated Sustainability Assessment (FISA) [10], and a combination of Sen's Capability Approach with Social Hotspot Database (SHDB) indicators [16] are also identified. The fourth and last observation is that the studies consider different phases of the life cycle of the system studied. Some perform the social assessment for only a subset of phases, such as the production and manufacturing phases [20], while others consider the complete life cycle.

Table 1 Summary of previous S-LCA studies on renewable energy sources

The previous observations were considered when developing the methodology presented in this document. First, the methodology must be adaptable to the different technical characteristics of existing and future renewable energy production systems. This is necessary because we envision analysts comparing the social performance of different renewable energy systems, which may include current and future innovations whose technical details are unknown at present. Second, the framework must accommodate the differing characteristics of different geographical locations. Locality is an important factor when assessing the social impacts on individuals, and analysts should be able to use the framework regardless of the location of the analysis. Finally, the framework should accommodate studies that perform either a full life-cycle assessment or an assessment of a single activity within the life cycle.

A few of these studies evaluate the social impacts of PV panels, which is of particular relevance to the case study presented in this paper. Traverso et al. [20] performed a life-cycle sustainability assessment (LCSA) of the assembly step for PV modules produced in Germany and Italy. The social component of their analysis adopts an S-LCA approach focused on the workers involved in the product chain. Their results show significant differences between the social impacts of PV modules on workers in the two countries, highlighting the importance of regional characteristics in S-LCA. Dubey et al. [21] summarize the positive and negative social impacts of PV modules. Their analysis highlights the need for a better understanding of the social impacts of PV modules, since most studies focus on the workers stakeholder group. These studies highlight the importance of considering social impacts when evaluating the sustainability of environmentally friendly technologies and fuels, as doing so may reveal negative social impacts that are not identified when focusing only on environmental impacts.

In the present paper, an S-LCA of rooftop solar panels is presented. The framework combines existing knowledge in a novel manner to address a set of identified challenges to performing an S-LCA. The development steps of the framework are presented in detail in “Challenge-derived framework development”. In “S-LCA framework and implementation”, the framework itself is presented, along with an explanation of how it should be implemented to perform a social assessment. The framework adheres to the LCA structure presented in the ISO 14040 standard for environmental LCA [23], which is organized into the following stages: goal and scope, inventory analysis, impact assessment, and interpretation of results. The social impact calculation is based on a set of study-specific indicators along with performance reference points (PRPs).

For this study, only the use and end-of-life phases of the solar panel life cycle are considered. Transportation impacts for these phases were not included because, at this pre-implementation stage, too little is known about the locations of the different processes. Transportation impacts should instead be considered at a later stage, when more is known about the infrastructure for the panels and a better estimation is possible. The analysis is geographically focused on the southeast region of the United States (U.S.), and its findings should only be extended to regions that share similar social and geographical characteristics. The U.S. is among the top ten electricity-consuming countries worldwide, second only to China, a country with a population more than four times that of the U.S. [24]. Due to its high electricity consumption, the implementation of renewable energy systems in the U.S. is significant from both a national and a global perspective. The analysis presented in this document aims to complement the environmental benefits of residential solar panels with social impact knowledge. With regard to stakeholders, the analysis aims to understand the potential social impacts of rooftop solar panels on workers, consumers, the local community, and society.

The work presented in this paper adds to the existing literature on the social impacts of renewable energy sources [12, 25]. An increased understanding of the social impacts of residential rooftop solar panels may be useful to other regions inside and outside of the U.S. that are considering the potential social impacts of their use.

Data and methods

Challenge-derived framework development

Before presenting the case study analysis and results, the authors explain the process followed to develop the analysis framework. Explaining the framework development is necessary for readers to understand the origin of the framework, how it is organized, its instructions for implementation, and, more importantly, the purpose of the case study presented in this article. The framework development process is divided into three stages: systematic mapping of the social assessment field, SIA framework development, and case study testing. The purpose of the systematic mapping is to identify key challenges to performing social assessments. An expert feedback study was then performed to verify the identified challenges. Using these learnings, a prototype social assessment framework was developed, and feedback on the framework was gathered from capstone design students. The last part of the framework development process is case study testing, which is the work presented in this manuscript.

The following sections present the framework development process for the reader to better understand the origins of the framework, its objectives, and how it is expected to be implemented before showing any results.

Systematic mapping of social assessment field

The first stage of developing the framework involved a systematic mapping of the social assessment field. The research question investigated was the following: What are the current methods available to perform social impact assessments, and how have these been implemented? The reader is directed to [26] for a detailed explanation of the systematic mapping study. The systematic mapping reviewed 81 articles, of which 49 included a case study application and nine were non-peer-reviewed articles. The main outcome was the identification of twelve recurring challenges to performing SIA. The challenges, along with related journal articles, are summarized in Table 2.

Table 2 Identified challenges to performing SIA

Expert feedback study

To evaluate the validity of the identified challenges, expert feedback was collected through online surveys. Six experts provided feedback through Likert-scale and open-response questions. The questions evaluated the following criteria: relevance or validity of the challenge, frequency of encountering the challenge, and importance of the challenge. For relevance or validity, the answer options were: yes, maybe, or no. For frequency of encountering the challenge, the options were: always, sometimes, rarely, or “I don’t perform these types of assessments.” For importance of the challenge, the options were: very important, moderately important, slightly important, or “I don’t know.” A space for open feedback was also provided for each challenge. The experts were active researchers in the areas of E-LCA and S-LCA based in the United States and the European Union.

Each challenge received full, mixed, or no support from the experts. Based on this feedback, the number of challenges was reduced from 12 to 10: challenges #10 and #11 were eliminated because they were considered part of the design of the assessment rather than challenges to performing it. Table 3 shows the results of the expert surveys.

Table 3 Summary of expert feedback regarding the challenges

Novice user study

The novice user study involved undergraduate senior capstone students from the Georgia Institute of Technology in Atlanta, GA. The students were given a 50-min lecture on the topic of S-LCA, along with an example S-LCA of a laptop computer. As part of the lecture, the students were provided with a simplified version of the S-LCA framework that did not include the impact assessment portion of the analysis. The impact assessment stage was removed because of the significant time and data resources needed to complete it, which is not feasible for students to do properly in less than one semester. For most of the students, the lecture was their first introduction to the topic of social impacts, so performing a full S-LCA was deemed too overwhelming and time-intensive. Instead, the focus of the lecture and the exercise was to give students the knowledge to craft an S-LCA analysis. The students were instructed to follow the framework and to use the United Nations Environmental Programme/Society of Environmental Toxicology and Chemistry (UNEP/SETAC) guidelines as a source of social impact categories and indicators. Feedback on the usefulness of the framework, along with any additional comments, was collected from the students electronically.

The reports were assessed qualitatively based on the following eight criteria: evidence of social awareness; level of applicability to the project; accuracy and completeness of framework implementation; increased mastery of appropriate terminology; ability to be critical of their projects for the sake of improving social impacts; goal and scope explanation; inventory analysis explanation; and interpretation of results explanation. For each report, a qualitative score of poor, acceptable, or excellent was given based on the rubric shown in Additional file 3. An inter-rater agreement analysis was performed by evaluating the percentage agreement between two raters. The goal of this analysis was to evaluate the robustness of the qualitative assessment and ensure scientific repeatability. Both raters were graduate-level engineering researchers with expertise in qualitative and mixed methods research. The first rater coded all the data using the rubric in Additional file 3; the second rater independently coded a randomly selected 25% of the data. Agreement was checked by comparing the percentage of matching scores for the shared dataset. A high agreement between the two raters indicates that the qualitative assessment is robust and not biased by an individual rater’s judgment.

The S-LCA sections of the capstone reports were assessed for seven student groups. The criteria used for the qualitative evaluation aim to capture the students’ ability to apply the provided template and reference documents and to thoroughly explain the importance of each assessment stage. This qualitative assessment is expected to identify the areas in which the students excelled. As described earlier, an inter-rater agreement analysis was conducted to verify the robustness of the qualitative assessment criteria. The results show an overall agreement of 76% between the two raters, which is considered moderate-to-strong agreement.
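The percentage-agreement measure used above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' analysis script; the rubric scores shown are invented for demonstration, and the study's real data come from the rubric in Additional file 3.

```python
def percent_agreement(rater_a, rater_b):
    """Share of items on which two raters assigned the same score."""
    assert len(rater_a) == len(rater_b), "raters must score the same items"
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Hypothetical 25% subsample scored with the rubric's three levels
rater_a = ["excellent", "acceptable", "acceptable", "poor", "excellent"]
rater_b = ["excellent", "acceptable", "poor", "poor", "excellent"]
print(f"{percent_agreement(rater_a, rater_b):.0%}")  # → 80%
```

In the study, the same computation over the shared 25% of coded reports yielded the 76% agreement reported above.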

S-LCA framework and implementation

After incorporating the expert user and novice user feedback, the resultant S-LCA framework is presented in Additional file 4. The implementation of the framework is explained in detail in the following section.

How is the framework implemented?

Goal and scope

The objective of the goal and scope stage is to define why the study is being performed and what is included in the analysis. Table 4 shows the recommended template for summarizing the goal and scope information. The summary should state the motivation for performing the study and define the system boundaries. For this stage, the framework classifies the analysis as informative, comparative, or enhancement type, a classification adopted from the work of Kjaer et al. [61] on evaluating the environmental impact of product–service systems (PSS). An informative analysis aims to understand the potential social impacts of a single product system. In a comparative assessment, either various concepts of the same product or different products with similar functionality are compared. In an enhancement analysis, successive iterations of the same product are compared, where each change aims to improve the social impacts of the product. In the goal and scope, the user also defines whether the analysis includes company conduct metrics and the desired level of detail of the analysis. An initial definition of the system boundaries is provided by stating the life-cycle stages considered, the associated activities for each stage, and the stakeholder groups considered.

Table 4 Goal and scope information template
Inventory analysis

The objective of the inventory analysis is to select the indicators used in the analysis; the selection must match the goal and scope of the study. As part of the systematic mapping procedure [26], a database of indicators was created (see Additional file 1), and this indicator set is used as the starting point of the inventory analysis. The following steps are used to select the list of indicators:

  1. Refer to the indicator database shown in Additional file 1 of the supplementary material.

  2. Select relevant indicators based on the goal and scope of the case study.

     a. For each indicator, identify the following:

        • Indicator name

        • Indicator type: quantitative, semi-quantitative, or qualitative

        • Desired direction, or direction of positive social impact: positive or negative

        • Data collection method for the indicator: primary (directly from the source) or secondary (from indirect sources)

        • Scale of the indicator: state, region, industry sector, or company

        • Social impact category as per the Guidelines for Social Assessment of Products from the United Nations Environmental Program (UNEP) [50]; if a new social impact category is desired, provide enough detail for the reader to understand why it is necessary.

        • Stakeholder group(s) as per the Guidelines for Social Assessment of Products from the United Nations Environmental Program (UNEP) [50]; if a new stakeholder group category is desired, provide enough detail for the reader to understand why it is necessary.

        • Source of indicator

     b. Perform an indicator data quality assessment using the modified matrix method provided in the framework.

     c. Update the list of indicators based on the results of the data quality assessment.

     d. (Optional) Benchmark the list of indicators using stakeholder input: when there is access to the stakeholders and a high-detail analysis is being performed, use stakeholder input data to validate the list of indicators used in the analysis.

  3. Define the performance reference points (PRPs) used for the quantitative indicators.

Rather than defining PRPs for the quantitative indicators listed in the indicator database (see Additional file 1), the authors recommend that these be defined based on the goal and scope of the analysis. Although one might expect a universal set of PRPs to provide robustness and standardization, social impacts depend highly on local factors and on the characteristics of the assessment being performed. For this reason, instead of providing PRP values for the indicators, we provide a methodology to determine them. For the indicators used in the analysis shown in this document, the PRPs are listed in Additional file 2, a spreadsheet file in which each sheet represents a portion of the assessment. The sheets show the indicators used in the assessment, define how the PRPs were calculated for each quantitative indicator, and show the normalization approach used for each indicator.

The next step is to perform a data quality assessment of the data collected for each indicator. Additional file 5 shows the data quality matrix method recommended in this framework. The method is based on the data quality assessment presented in the 2018 Handbook for the Social Impact Assessment of Products [58] and the Pedigree matrix method [62]. Each column represents a criterion used in the assessment, while each row provides the conditions needed to assign the data quality score. Scores range from 1 (best) to 5 (worst). The assessment is based on the following four criteria: (1) accuracy, integrity, and validity; (2) timeliness or temporal correlation; (3) geographical correlation; and (4) technological correlation. Accuracy, integrity, and validity relate to the sources of the data, the acquisition methods used to gather the data, and the verification procedures applied to them [58, 62]. Timeliness or temporal correlation refers to the correlation between the time of the study and the time of collection of the data [62]. Geographical correlation refers to the correlation between the area under study and the area of the collected data [58, 62]. Technological correlation refers to the correspondence between the enterprises, industries, and/or technological characteristics of the product under study and those of the collected data [58, 62]. As stated by Weidema et al. [62, 63], each data quality indicator assesses an independent aspect of data quality. In addition to assessing the quality of the collected data, the results of the matrix method should highlight opportunities to improve the quality of the data being collected by evaluating the results for each data quality indicator. The resulting average score for each indicator value must be less than 3 to pass the quality assessment.
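The pass/fail rule above can be expressed as a short sketch. This is a hypothetical illustration under the stated assumptions: each indicator's data are scored 1 (best) to 5 (worst) on the four criteria, and the average must be below 3 to pass; the criterion keys and the example scores are our own naming, not taken from Additional file 5.

```python
# The four criteria named in the text (keys are our own shorthand)
CRITERIA = ("accuracy_integrity_validity", "temporal",
            "geographical", "technological")

def passes_quality_check(scores, threshold=3.0):
    """Return (pass?, average) for one indicator's 1-5 criterion scores."""
    avg = sum(scores[c] for c in CRITERIA) / len(CRITERIA)
    return avg < threshold, avg

# Hypothetical scores for one indicator's data
scores = {"accuracy_integrity_validity": 2, "temporal": 3,
          "geographical": 2, "technological": 4}
ok, avg = passes_quality_check(scores)
print(ok, avg)  # → True 2.75
```

Indicators that fail the check would either be dropped from the inventory or have their data re-collected from higher-quality sources.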

Impact assessment

The objective of the impact assessment stage is to provide meaning to the list of indicators created in the inventory analysis. The first step is to define performance reference points (PRPs) for the quantitative indicators. PRPs are defined as “threshold values used to provide meaning to the quantitative data. They provide a reference from which to quantify the impact of the quantitative indicators” [1]. The definition of each PRP is provided in Additional file 2. The impact assessment covers qualitative, semi-quantitative, and quantitative indicators. All values are normalized to a scale between 0 and 1, where 0 represents the worst social performance and 1 represents the best. Because the final indicator values are assumed to represent positive social performance, the normalization procedure must account for indicators with different directions of improvement. For quantitative indicators where the desired direction is positive, the range between the minimum and maximum reference values is used to normalize the indicator:

$$\mathrm{Indicator}_{\mathrm{norm}} = \frac{\mathrm{Indicator} - \mathrm{PRP}_{\min}}{\mathrm{PRP}_{\max} - \mathrm{PRP}_{\min}}.$$
(1)

If the desired direction of the indicator is negative, we need to modify our normalization equation so that the final indicator value aligns with the overall scale of 0 being the lowest social performance and 1 being the best social performance. To do this, we subtract the resulting value of Eq. (1) from 1 as follows:

$$\mathrm{Indicator}_{\mathrm{norm}} = 1 - \frac{\mathrm{Indicator} - \mathrm{PRP}_{\min}}{\mathrm{PRP}_{\max} - \mathrm{PRP}_{\min}}.$$
(2)

Two types of semi-quantitative indicators are used in the framework: yes/no questions and Likert-scale questions with values between 1 and 5. To quantify yes/no questions, a yes is assigned a value of 1 and a no a value of 0. For Likert-type questions, the normalization again depends on the direction of improvement of the indicator. For an indicator where the desired direction of improvement is positive (5 represents the best social performance and 1 the worst), the normalization is:

$$\mathrm{Indicator}_{\mathrm{norm}} = \frac{\mathrm{Indicator} - 1}{4}.$$
(3)

For an indicator where the desired direction of improvement is negative (1 represents the best social performance and 5 represents the worst social performance), the normalization procedure is the following:

$$\mathrm{Indicator}_{\mathrm{norm}} = \frac{5 - \mathrm{Indicator}}{4}.$$
(4)

As with the semi-quantitative and quantitative indicators, the results for qualitative indicators are normalized between 0 (worst social performance) and 1 (best social performance). Table 5 shows the recommended quantification procedure, adopted from the Product Social Impact Assessment (PSIA) framework [63]. Instead of using values that range from −2 to +2, our framework uses values between 0 and 1 to be consistent with the normalization procedure. The quantification is based on the performance of the qualitative indicator relative to the PRP, where 1 represents the ideal or desired performance and 0 represents the worst performance for that indicator.
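The normalization rules in Eqs. (1)–(4) can be sketched as follows. This is a minimal illustration in our own naming; the function names, and the PRP values in the examples, are hypothetical and not drawn from the study's Additional file 2.

```python
def normalize_quantitative(value, prp_min, prp_max, positive_direction=True):
    """Eq. (1) when higher values are better; Eq. (2) when lower is better."""
    norm = (value - prp_min) / (prp_max - prp_min)
    return norm if positive_direction else 1.0 - norm

def normalize_likert(value, positive_direction=True):
    """Eqs. (3)/(4): map a 1-5 Likert score onto the [0, 1] scale."""
    return (value - 1) / 4 if positive_direction else (5 - value) / 4

def normalize_yes_no(answer):
    """Yes/no semi-quantitative indicators: yes -> 1, no -> 0."""
    return 1.0 if answer.strip().lower() == "yes" else 0.0

# Illustrative indicator values against made-up PRPs
print(normalize_quantitative(15, prp_min=10, prp_max=20))   # → 0.5
print(normalize_likert(4))                                  # → 0.75
print(normalize_likert(4, positive_direction=False))        # → 0.25
print(normalize_yes_no("Yes"))                              # → 1.0
```

Note how a score of 4 on a Likert indicator maps to 0.75 when higher scores are better but to 0.25 when lower scores are better, keeping 1 as the best social performance in both cases.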

Table 5 Quantification of qualitative indicators
Interpretation of results

The objective of the interpretation of results stage is to identify the greatest contributors to social impacts and to propose changes that improve them. This stage consists of summarizing the main learnings from the analysis. The strategy used to summarize and communicate the results should align with the question the study is intended to answer; in other words, the interpretation of results should align with the goal and scope definition of the analysis. Instead of aggregating the results of indicators, the recommended strategy is to interpret each indicator individually and, in addition to the numerical result, to provide a narrative of the results obtained in the analysis. The aim of recommending a narrative is to provide a complete interpretation of the results, which may not be clear from a numerical value alone.

When aggregation is used, it should align with the type of analysis being performed. For an informative study, no aggregation is recommended, as the goal is to understand the potential impacts of a single product system. For a comparative or enhancement study, the goal is to compare the social impacts of different alternatives; here, aggregation is recommended only to facilitate the comparison among alternatives rather than to draw conclusions about social impacts. Aggregation may likewise facilitate comparisons among stakeholder groups or among product life-cycle stages, again only as an aid to comparison. Regardless of the aggregation strategy implemented, the aim is to select one that aligns with the goal and scope of the analysis.

Rooftop solar panel case study

The last step of the framework development process is a case study application. This step is important not only to validate the framework but also to identify additional changes needed to enhance it. The case study presented applies the challenge-derived S-LCA framework to perform a social impact assessment of rooftop solar panels in the southeast region of the continental U.S.

The analysis presented in this study is informative, as it provides results for a single product system. It is a low-detail analysis based on secondary data sources, with the goal of gaining an initial understanding of the greatest areas of concern for social impacts. A common approach is to perform a follow-up, high-detail analysis of those areas of greatest concern based on primary data, which provides a more complete representation of the processes included in the analysis.

Results and discussion

This section shows the results of implementing the framework to evaluate the social impacts of rooftop solar panels. As stated earlier, the case study application is the last step of the framework development process as it is considered more of a validation of the framework.

Goal and scope

The first step of the framework is to provide a justification of the study and to define what is included in the analysis. Table 6 shows the goal and scope information for the rooftop solar panel case study based on the template provided in the framework. Each line of Table 6 provides important information about the study design. The table is divided into three sections: the goal, the scope, and the initial system boundaries. As part of the goal section, the analyst states the objective of the study, which justifies why the study is valuable. The processes considered in the analysis are defined early in the study design process because they affect subsequent definitions, such as the system boundary and the analysis type. The next line defines whether company conduct is included in the analysis. Social impact studies may include impacts related to company policies and decisions, which may affect individuals internal and external to the company. In the study presented, company conduct is not considered as part of the goal of the study because the study is performed at an early stage, when the companies that will perform the work have not yet been identified. The level of detail of the analysis is defined next. Here, the analyst states the level of detail of the study, which then imposes requirements on the type of data to be used for the impact assessment. The social impact analysis of rooftop solar panels shown in this paper is a subset of a larger study that aims to understand the social implications of numerous renewable energy methods for the state of Georgia [64]. Although it is always beneficial to consider the complete product life cycle when evaluating a technology, the focus of the current analysis is to identify the subset of three technologies that provide the best social impact results.
Since this is considered more of a screening phase at the early stages of the technology selection process, the scope of the analysis is limited to the use and end-of-life stages. The next step of the process is to perform a detailed analysis of the three technologies with the most potential, in which the full life cycle of each technology is considered. The next line in Table 6 provides a brief but more detailed justification of why the study is being performed and of its importance. The analyst then defines how many products are considered in the study. In this case study, a single product is analyzed: the residential rooftop solar panel. If more products were included in the analysis, this is where they would be stated. For the sake of clarity, the last line in the goal section defines the product functionality. This is important when selecting several different products that have similar functionality, perhaps to compare the social impact of each product.

Table 6 Goal and scope information for rooftop solar panel case study

The next section in Table 6 is the scope of the study, where the spatial scale of the study is defined along with the analysis type. The analysis presented is focused on the southeast region of the U.S. Defining a spatial scale is important when evaluating whether the results of a study can be applied to other regions. Before extending the learnings from this study to other regions, readers should compare relevant characteristics, such as geography and demographics, and look for similarities that may justify adopting such learnings. The next line defines the study type of the case study, which is informative. The reader is referred to “Data and methods” of this article for an explanation of the different study types.

The last section of Table 6 defines the initial system boundaries of the case study. The system boundaries are defined as initial because, as the study progresses and more information is analyzed, they may need to be re-defined. The first two lines of this section define the system boundaries and the processes considered in the analysis. Figure 1 shows the system boundaries defined for the study. As previously stated, the system boundaries include the use and end-of-life phases and exclude the production and processing stages. The decision to exclude the production and processing stages is based on the geographical scope of this study, as it only aims to understand the local impacts on individuals in Georgia, and none of the components used in the residential rooftop solar panels considered for the study have production or processing activities occurring in that state.

Fig. 1
figure 1

Initial system boundaries defined for rooftop solar panel assessment

The last part of Table 6 defines the stakeholder groups considered in the analysis: the consumers, the local community, society, and the workers. The consumers are selected because they might be affected in numerous ways, such as price changes and changes in real estate and tax values. The local community stakeholder group is included to consider impacts on land, increased use of natural resources, and the public perception of the technology. By including the society stakeholder group, the study aims to evaluate the impacts the technology might have at the economy and policy development levels. The last stakeholder group considered in the analysis is the workers involved in the use and end-of-life stages. In this category, the study aims to evaluate social impacts not only on those involved in installing the rooftop panels, but also on those workers involved in the disposal and/or recycling processes.

Inventory analysis

The inventory analysis involves the selection of the indicators used in the impact assessment. Even though there are qualitative and quantitative methodologies to establish agreement on the selection of indicators, many factors may affect the list of indicators used to perform the assessment. One important requirement is for the indicators to match the goal and scope of the study. Although the lack of a universal set of indicators is often criticized, the breadth of applications of S-LCA makes it difficult to define a single set of indicators that would apply to all scenarios [36]. The S-LCA framework used in this assessment contains a database of impact indicators developed through a systematic mapping procedure. The reader is directed to the work of [26] for a complete explanation of the systematic mapping procedure. Please refer to the indicator database file provided in Additional file 1.

Indicator selection

The indicators selected for the analysis should be tabulated and organized as shown in Table 7 to help readers learn from the analysis. Because there is no universal set of indicator names, it is important to provide additional information for each indicator to improve objectivity in the interpretation of the results. In this framework, in addition to the name, the indicator type, its impact category, its stakeholder group, and the desired direction of its values are all provided to improve transparency when communicating the results.

Table 7 List of selected indicators

Table 7 summarizes the indicators used in the residential rooftop panel analysis. The indicators are selected from the indicator database provided in the S-LCA framework (see Additional file 1). The selection of the indicators is based on how well they align with the goal and scope definition of the analysis, and the definition of the system boundaries. As shown in Table 7, each indicator has its numeric identification, its name, its type, its impact category, the stakeholder group it belongs to, and the desired direction or answer. It is important to provide this information for each indicator to allow for easier comparison among different studies. The desired direction or answer of each indicator is essential as the directionality is taken into consideration in the interpretation of the results. A desired direction or answer is used as a reference to determine what is a positive or negative social impact. The direction is also important during the impact assessment stage as this affects the approach used in the normalization process.
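The per-indicator metadata just described can be captured as a small record so that every study reports the same fields. A minimal Python sketch follows; the field names mirror the columns of Table 7 but are our own choice, not prescribed by the framework:

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """One row of an indicator table such as Table 7 (field names illustrative)."""
    id: int
    name: str
    kind: str               # "quantitative", "semi-quantitative", or "qualitative"
    impact_category: str
    stakeholder_group: str
    desired: str            # desired direction or answer, e.g. "max", "min", "No"

# Example entry based on indicator #1 from the analysis
child_labor = Indicator(1, "Child involvement in any life-cycle activity",
                        "semi-quantitative", "Child labor", "workers", "No")
```

Keeping the desired direction alongside each indicator makes the later normalization step unambiguous, since the direction determines which normalization formula applies.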

Perform data-quality assessment using matrix method

An assessment of the quality of the data is an important step in any social assessment because the quality of the learnings and conclusions obtained from the study depends on the quality of the data being used for the assessment. The purpose of the data quality matrix is for the individuals performing the analysis to identify any instances where the data sources are not up to the required standard. Once this is identified, the individual performing the assessment can determine possible countermeasures, which is always preferred over having studies reach conclusions that are later considered invalid due to low-quality data sources. The decision to incorporate the pedigree matrix into the data quality assessment step is to “quantify the uncertainty resulting from less than perfect data” [65]. The authors recognize that access to high-quality data may be prohibitive for numerous reasons, such as geographic limitations, political restrictions, or cost barriers, which is why having a methodology to identify shortcomings in data quality is so important. The pedigree matrix provides a holistic approach to data quality assessment by considering aspects of the dataset itself, such as reliability and completeness, as well as its applicability with regard to time, geography, and technology. Therefore, the data quality assessment method recommended in this framework is highly influenced by the pedigree matrix method.

The results of the data quality matrix method should highlight opportunities to improve the quality of the collected data by evaluating the results for each of the data quality criteria. The resulting average score must be less than 3 to pass the quality assessment test. Table 8 shows the data quality results for each indicator used in the analysis. Although Table 8 shows all literature sources used for each indicator, the data quality scores shown in the table are only for the literature sources with the lowest data quality score. The lowest values are in the timeliness column, which is typical for studies that rely on secondary data. Even though a source may receive a low score on the timeliness criterion, it is still valuable if it scores higher on the other quality criteria.
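The pass/fail rule above can be sketched in a few lines of Python, assuming five pedigree-style criteria each scored from 1 (best) to 5 (worst); the criteria named in the docstring follow the pedigree matrix discussion above:

```python
def passes_quality_check(scores, threshold=3.0):
    """Return True if the mean data-quality score is below the threshold.

    `scores` holds one value per pedigree-style criterion (1 = best, 5 = worst),
    e.g. reliability, completeness, timeliness, geographic correlation, and
    technological correlation.
    """
    return sum(scores) / len(scores) < threshold

# A source that is weak on timeliness (4) but strong elsewhere still passes
assert passes_quality_check([1, 2, 4, 1, 2])      # mean 2.0
assert not passes_quality_check([4, 3, 5, 3, 4])  # mean 3.8
```

This illustrates the point made above: a low timeliness score alone does not disqualify a source as long as the other criteria keep the average below the threshold.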

Table 8 Data quality values and corresponding literature sources for each indicator

Impact assessment

Define the performance reference points (PRPs) used for the quantitative indicators

The first step in the impact assessment stage is to define performance reference points (PRP) for the quantitative indicators. The selection of the PRPs should align with the goal and scope of the analysis. As an example, let us consider that the analysis aims to determine social impacts at the international level. In this case, the scale of the PRPs used should reflect international reference values. These PRPs would be different for an analysis that is focused on regional impacts, in which a set of regional or national PRPs would be more appropriate. In the case study presented, four out of the 24 indicators are quantitative.

Table 9 provides a summary of the PRPs used to characterize the quantitative indicators in the analysis along with their corresponding source. These values will be used as reference when calculating the normalized value for each of the indicators per Eqs. (1), (2), and (3). The next section shows the value of each indicator, along with its normalized value. Please refer to the example calculations shown in “Quantification and normalization of indicators” to understand how the normalized value is achieved for each indicator. The impact assessment consists of qualitative, semi-quantitative, and quantitative indicators. All values are normalized to a scale between 0 and 1, where 0 represents the lowest social performance and 1 represents the best social performance.

Table 9 Performance reference points (PRPs) for quantitative indicators

Quantification and normalization of indicators

Table 10 shows the normalized indicator values for each indicator. The indicator values are based on the information provided in each literature source shown in Table 8. Not all indicators can be disaggregated into the use and end-of-life phases, as most of them apply to both. For the reader to understand how the final value was calculated, the calculation procedure for three indicators, one quantitative, one semi-quantitative, and one qualitative, is shown below. An additional spreadsheet shows how the final value of each indicator is obtained (see Additional file 2). Because the indicators shown in Table 10 are obtained from the indicator database (see Additional file 1), some of these indicators were obtained from the UNEP/SETAC framework, specifically from the methodological sheets [8].

Table 10 Non-normalized and normalized indicator values for rooftop solar panel S-LCA

As an example, for a quantitative indicator calculation, we are using indicator #3 “Contribution of the technology to economic progress”. The quantification is based on the number of employees in the solar industry relative to electricity generation industries using the following energy sources: fossil fuel, nuclear, wind, combined heat and power (CHP), hydro, geothermal, and biomass, based on the 2019 United States Energy and Employment Report [73]. To do this calculation, values are substituted in Eq. (5):

$${\text{Indicator}}_{{{\text{norm}}}} = \frac{{\left( {{\text{Indicator}} - {\text{PRP}}\_{\text{min}}} \right)}}{{({\text{PRP}}\_\max - {\text{ PRP}}\_\min )}},$$
(5)
$${\text{Indicator}}_{{{\text{norm}}}} = \frac{{149,343{\text{ jobs}} - 8,526{\text{ jobs}}}}{{211,469{\text{ jobs}} - 8,526{\text{ jobs}}}},$$
$${\text{Indicator}}_{{{\text{norm}}}} = \frac{140,817}{{202,943}},$$
$${\text{Indicator}}_{{{\text{norm}}}} = 0.694.$$
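The min-max normalization of Eq. (5) is straightforward to implement; the following sketch reproduces the worked calculation for indicator #3, with the job counts and PRPs taken from the figures above:

```python
def normalize_quantitative(value, prp_min, prp_max):
    """Min-max normalization against performance reference points (Eq. 5)."""
    return (value - prp_min) / (prp_max - prp_min)

# Indicator #3: solar-industry jobs relative to the PRPs for the other
# electricity-generation industries (2019 U.S. Energy and Employment Report)
norm = normalize_quantitative(149_343, 8_526, 211_469)
print(round(norm, 3))
```

Note that the result depends on the PRPs chosen, which is why Table 9 reports each PRP with its source.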

There are two types of semi-quantitative indicators used in the analysis. The first is a yes-or-no indicator, where the final value is either 0 or 1: a 0 represents a mismatch between the indicator value and the desired answer, and a 1 indicates agreement between the two. Using indicator #1 “Child involvement in any life-cycle activity”, we see that the indicator value is “No” and the desired answer is also “No”, so the final indicator value is 1. The second type of semi-quantitative indicator is a Likert-type scale, with values between 1 and 5. The final value of the indicator depends on the desired direction of the indicator, so we provide one example for each direction below. For indicator #2 “Community/trust approval in technology risk information”, the desired result of 5 means that the community has complete trust in the associated risk information. Because our indicator value is 5, the final value based on Eq. (6) is then:

$${\text{Indicator}}_{{{\text{norm}}}} = \frac{{\left( {{\text{Indicator}} - 1} \right)}}{4},$$
(6)
$${\text{Indicator}}_{{{\text{norm}}}} = \frac{{\left( {5 - 1} \right)}}{4},$$
$${\text{Indicator}}_{{{\text{norm}}}} = 1.$$

If, for example, we take indicator #6 “Health hazards from emissions during any life-cycle activity”, the desired value is 1, as we want the lowest level of emissions. In our analysis, the indicator value is 5, so the final value is calculated using Eq. (7):

$${\text{Indicator}}_{{{\text{norm}}}} = \frac{{\left( {5 - {\text{Indicator}}} \right)}}{4},$$
(7)
$${\text{Indicator}}_{{{\text{norm}}}} = \frac{{\left( {5 - 5} \right)}}{4},$$
$${\text{Indicator}}_{{{\text{norm}}}} = 0.$$
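Equations (6) and (7) differ only in the desired direction of the Likert scale, so both can be expressed in one small sketch (the `higher_is_better` flag is our own naming, not from the framework):

```python
def normalize_likert(value, higher_is_better=True):
    """Map a 1-5 Likert score to [0, 1], honoring the desired direction."""
    if higher_is_better:
        return (value - 1) / 4   # Eq. (6): a score of 5 maps to 1
    return (5 - value) / 4       # Eq. (7): a score of 1 maps to 1

assert normalize_likert(5, higher_is_better=True) == 1.0   # indicator #2
assert normalize_likert(5, higher_is_better=False) == 0.0  # indicator #6
```

Encoding the direction once, next to the indicator definition, avoids accidentally applying the wrong formula during the impact assessment.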

For a qualitative indicator type, the quantification is done by comparing the indicator value with the PRP and assigning a value between 0 and 1 from Table 5. Because no qualitative indicators are used in the analysis presented in this manuscript, a fictional example from the PSIA handbook is used as a reference [63]. The example is fictional, but it conveys how the quantification is applied in an occupational health and safety (OHS) context. “A supplier produces cotton in India. The hotspot identification phase reveals potential risks regarding OHS. When the company is approached it can send an audited report, showing the efforts of the company to improve this, while the secondary country data show the situation is not on a generally acceptable level. Without such data this supplier would score − 2, if the data are provided and deemed to be credible evidence, it can be scored − 1. If the company can show that it has achieved an acceptable OHS performance or is accepted by a credible certification standard that covers the OHS performance, it can be scored a level 0. Further evidence may show even better performance, which could merit a score of + 1 or + 2” [63]. When quantifying qualitative indicators, the analyst must clearly state the reasoning behind the quantitative value given to the indicator for transparency and communication purposes.

An additional step would be to weight the final indicator values. Value weighting is not recommended for informative, low-detail studies, such as the rooftop solar panel case study shown in this article, because their objective is to provide an initial understanding of the system being studied using secondary data. Weighting is recommended at more detailed stages of the analysis, during which there is access to primary data that can be used to justify the weighting values. The reader is directed to the additional indicator file, where all the indicator quantifications and final values are shown in detail (see Additional file 2).
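If weighting were applied at a later, high-detail stage, one possible scheme is a weighted average over the normalized indicator values. The sketch below is purely illustrative and not prescribed by the framework:

```python
def weighted_score(values, weights):
    """Weighted average of normalized indicator values.

    In a high-detail study the weights would be justified with primary data;
    equal weights reduce this to the arithmetic mean.
    """
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

assert weighted_score([0.2, 0.8], [1, 1]) == 0.5  # equal weights = plain mean
```

Whatever scheme is used, the weights should be reported alongside the results so that readers can judge how much they shape the conclusions.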

Interpretation of results

The objective of the interpretation of results stage is to identify the greatest contributors to social impacts and to propose changes to improve such impacts. The results summarized in Table 10 are categorized by stakeholder group to assist the researcher in identifying the stakeholder groups that are most affected by the implementation of the rooftop solar panels. Although S-LCAs are mostly performed to understand the impacts of product systems on stakeholders, there are instances in which the goal of the assessment is to identify the life-cycle stages that result in the greatest social impact. In such a case, it would be appropriate to organize the indicators by life-cycle stage rather than by stakeholder group. The strategy to organize the indicators should match the objective of the analysis.

Table 11 shows the individual and aggregated S-LCA results for each stakeholder group. The aggregation procedure consists of an arithmetic average of all indicators in each stakeholder group. In this case study, one of the key motivations is to identify the most affected stakeholder groups. If the analysis aims to identify the life-cycle activities that result in the greatest social impact, the aggregation of the results should instead be done at the life-cycle stage level. When adopting this approach, the analyst must ensure that the indicators selected in the analysis are differentiable among life-cycle stages. For example, an indicator may read “child labor involved in any life-cycle stage of the product”. This indicator is not appropriate to aggregate per life-cycle stage. The indicator should instead be written as “child labor involved in the production life-cycle stage” or “child labor involved in the end-of-life stage”. The purpose of providing the individual indicator values along with the average values in Table 11 is for the reader to identify the indicators that contribute the greatest negative impact, as the average value reduces the interpretation of a group of indicators to a single value. Based on the results, the stakeholder group with the worst social performance is the workers, with a value of 0.50.
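The per-group aggregation used for Table 11 is a plain arithmetic average of each group's normalized indicators. A minimal sketch with hypothetical values (not the actual Table 11 data) follows:

```python
from statistics import mean

# Hypothetical normalized indicator values grouped by stakeholder group
results = {
    "workers":   [0.3, 0.7, 0.5],
    "consumers": [0.6, 0.6],
}

# Arithmetic average per stakeholder group, as done for Table 11
aggregated = {group: round(mean(vals), 2) for group, vals in results.items()}
print(aggregated)
```

Because each group may contain a different number of indicators, these averages support ranking the groups but should not be read as directly comparable magnitudes, as discussed below.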

Table 11 Normalized results for each stakeholder group

For the consumers, local community, and society stakeholder groups, the results vary only slightly. Based on the values of 0.60, 0.67, and 0.62, respectively, the analyst may be tempted to draw conclusions about which of these stakeholder groups is most affected. Instead, two important learnings result. First, a more detailed study is recommended as a follow-up to the initial, informative results presented in this case study. A high-detail study, in which primary data are collected for each indicator, should provide a clearer explanation of any significant differences among these stakeholder groups. Second, we recommend that a weighting scheme be applied to the results. The weights should be selected to give more importance to the indicators that are most aligned with the goal of the study and that best highlight the processes that must be modified to reduce social impacts.

In the assessment presented in this paper, the case study is informative in type, meaning that no comparison is performed among different products or technologies. In this type of analysis, the use of aggregation is discouraged unless it is necessary to clarify the goal and scope of the analysis. Comparing aggregated results for the different stakeholder groups as shown may be misleading because the number and type of indicators used for each stakeholder group are different, adding variability to the analysis. Rather, it is recommended to analyze the results of the indicators individually. In the following sections, the results for each stakeholder group are shown individually. Each indicator is provided with a narrative to aid in the understanding of the given score. Providing a narrative that briefly explains the score of the indicator could go a long way in improving the transparency of the quantification of indicators, especially those whose value is not completely obtained from a quantitative procedure.

Stakeholder group: consumers

Table 12 shows the results for the consumer stakeholder group. The greatest social impacts result from the inability of consumers to choose the utility company that owns the technology. Assuming that the housing infrastructure can handle the structural load of the panels, the panels seem to provide good social performance for the consumers. The ability of solar panel components to be reused for other purposes, rather than being disposed of in a landfill, gives consumers the option of responsibly disposing of solar panel components when they are no longer useful.

Table 12 Normalized results for the consumer stakeholder group

Stakeholder group: local community

Table 13 shows the results for the local community stakeholder group. Rooftop solar technology does not suffer from community resistance or backlash, because it is seen as a green energy-producing technology. For the state of Georgia, the technology takes advantage of local expert personnel and workers [133, 134]. The greatest social impacts are expected in the form of gentrification, due to the increased real estate values seen in houses that use solar panels. In addition, access to rooftop solar panels is prohibitive for low-income members of the community. Overall, the technology is seen as a positive and environmentally friendly energy-generating solution. Its public acceptance across all socio-economic sectors relies on the implementation of programs for low-income community members.

Table 13 Normalized results for the local community stakeholder group

Stakeholder group: society

Table 14 shows the results for the society stakeholder group. The solar industry is booming in Georgia and in the U.S. As of 2019, the technology has contributed over $17 billion to the U.S. economy and employs more than 200,000 workers in the U.S. [72]. The state of Georgia also enjoys significant irradiation levels relative to the rest of the U.S., which makes it a potential energy source for most of the state. Numerous federal and local programs to assist individuals in the implementation of rooftop solar technologies were identified, which is positive. At a local level, the city of Atlanta has committed to providing 100% clean energy to its municipal operations by the year 2035.

Table 14 Normalized results for the society stakeholder group

Stakeholder group: workers

Table 15 shows the results for the worker stakeholder group, which has the worst social performance. Solar panels involve the use of precious metals that are linked to child labor practices during the extraction and processing life-cycle stages. Because the scope of the assessment is limited to the use and end-of-life stages, both of which occur in the state of Georgia and the U.S., this is not considered in the analysis. A significant issue with rooftop solar panels is the risk they pose to workers installing the panels on rooftops. In addition, recycling and disposing of solar panels present numerous health hazards to workers if these processes are not properly completed. The recycling and processing of the electronics used in solar panels involve toxic fumes that are detrimental to human health [78]. The success of end-of-life treatment relies on a proper disposal infrastructure rather than on disposing of the panels in landfills [88].

Table 15 Normalized results for the workers' stakeholder group

Overall learnings and recommendations

No aggregation is recommended for informative studies

In an informative S-LCA, the goal is to understand the social impacts of the product or technology being studied. Because there is no comparison among different concepts or products, it is recommended to analyze the indicators individually. Numerous frameworks aggregate the results at multiple levels of the analysis: the social impact category level, the product life-cycle level, or the stakeholder group level. Aggregation is beneficial when comparing products that have similar functionality because it provides an easy and quick comparison using a single number. In an informative study, however, aggregation may result in a loss of information, as it reduces the impact of multiple indicators to a single number. Although analysts with a technical background may prefer a single number to communicate the social impact performance of a product, it is recommended to provide the numerical performance of each indicator along with a narrative of the result. This provides a more holistic picture than a single number alone.

In a comparative type of analysis, aggregation is recommended to facilitate comparison among different production systems that provide similar functionality. In comparative type analyses, the user must develop a set of indicators that apply to all product systems being evaluated, allowing the individual assessments to be based on the same set of metrics. The aggregation of metrics should be done at a level that aligns with the goal of the analysis; the choice to aggregate the metrics at the product life-cycle level, the stakeholder group level, or the social impact category levels should clearly show what the best product systems are for the application.

The aggregation strategy for an enhancement study is similar to that of a comparative type of analysis because different concepts of the same product are being evaluated. The same set of indicators should be used to evaluate each product system concept, and the aggregation level should align with the goal and scope of the analysis. As in informative studies, the use of an indicator narrative is recommended to aid the reader in understanding the results.

Establish aggregation strategy before finalizing the set of indicators

Depending on the goal and scope of the analysis, the user may wish to aggregate the social impact results at the product life-cycle level or social impact category level. It is recommended to establish this during the goal and scope stage. This information is then used in the inventory analysis stage to develop the final set of indicators. Doing so allows the user to express the indicators in a way that matches the aggregation strategy.

Limitations

One limitation of the case study is that only one product is analyzed. A deeper evaluation plan should consider products from multiple industries and functionalities to help detect additional opportunities for improvement in the framework. Another limitation of the case study presented is that it is a low-detail analysis, performed to gain an initial understanding of the product system being evaluated and to identify the areas of the analysis that should receive more attention in subsequent analyses. Being a low-detail analysis, it relies on secondary data sources rather than primary data gathered from the stakeholder groups evaluated. The results of this low-detail analysis highlight the workers' stakeholder group as the most negatively impacted. A subsequent, higher-detail analysis should focus on gathering primary data from this group to better understand the source of its social impacts and to formulate recommendations to reduce such negative impacts.

Conclusions

The challenge-derived S-LCA framework has been applied to perform an informative analysis of the social impacts of a rooftop solar panel. It must be clarified that the rooftop solar panel case study shown in this paper is a low-detail analysis performed using only secondary data. Consequently, there are steps within the framework that are not shown in this case study, such as the benchmarking of indicators using stakeholder input or the collection of primary data. The case study is scoped to the use and end-of-life phases of the life cycle, and its geographical focus is on the state of Georgia in the United States of America. The framework allowed for the assessment of the potential impacts of rooftop solar technologies in the state of Georgia. The biggest concerns in the technology implementation result from unwanted displacement (gentrification) due to increased real estate prices, the inequity of access to the technology for low-income community members, the dangers posed to workers installing the solar panels, and the need for a suitable recycling infrastructure that ensures proper management of solar panel components at the end of their life.

Following the case study application, a few recommendations are provided that will be included in the next version of the challenge-derived S-LCA framework, concerning the comparison of different products or technologies with similar functionality. During the inventory analysis stage, the user should choose a set of indicators that applies to all the products being evaluated. This means that during the goal and scope stage of the assessment, the user should define the study as a comparative study and define the products or technologies being compared. This ensures the use of a single set of indicators that applies to all the products being assessed, rather than a different set of indicators for each product.

Overall, the objectives of the case study application were achieved. The use of the framework provided social impact information and areas of concern for potential social impacts, where efforts should focus if this technology is to be implemented. The case study application highlighted areas of improvement for the framework that will be modified accordingly as the framework evolves.

Availability of data and materials

Not applicable.

Abbreviations

CEO: Colorado Energy Office

ISO: International Organization for Standardization

kWh: Kilowatt-hour

LCSA: Life cycle sustainability assessment

PV: Photovoltaic

PACE: Property assessed clean energy

PRP: Performance reference point

S-LCA: Social life cycle assessment

SETAC: Society of Environmental Toxicology and Chemistry

SHS: Solar home systems

SHDB: Social hotspot database

UNEP: United Nations Environment Programme

U.S.: United States

TWh: Terawatt-hour

References

  1. Benoit C, Mazijn B (2009) Guidelines for social life cycle assessment of products. Life cycle initiative, UNEP-SETAC. Retrieved 16 October 2013

  2. Spierling S, Knüpffer E, Behnsen H et al (2018) Bio-based plastics—a review of environmental, social and economic impact assessments. J Clean Prod 185:476–491. https://doi.org/10.1016/j.jclepro.2018.03.014

  3. Kühnen M, Hahn R (2017) Indicators in social life cycle assessment: a review of frameworks, theories, and empirical experience. J Ind Ecol 21:1547–1565. https://doi.org/10.1111/jiec.12663

  4. Di Cesare S, Silveri F, Sala S, Petti L (2018) Positive impacts in social life cycle assessment: state of the art and the way forward. Int J Life Cycle Assess 23:406–421. https://doi.org/10.1007/s11367-016-1169-7

  5. International Organization for Standardization (2006) Environmental management—Life cycle assessment—Requirements and guidelines ISO 14040. Geneva

  6. Lucchetti MCMC, Arcese G, Traverso M, Montauti C (2018) S-LCA applications: a case studies analysis. E3S Web Conf 74:1–7. https://doi.org/10.1051/e3sconf/20187410009

  7. Petti L, Serreli M, Di Cesare S (2018) Systematic literature review in social life cycle assessment. Int J Life Cycle Assess 23:422–431. https://doi.org/10.1007/s11367-016-1135-4

  8. Benoît-Norris C, Vickery-Niederman G, Valdivia S et al (2011) Introducing the UNEP/SETAC methodological sheets for subcategories of social LCA. Int J Life Cycle Assess 16:682–690. https://doi.org/10.1007/s11367-011-0301-y

  9. American Public Health Association (2018) The public health impact of energy policy: policy statement 7825:1–9

  10. Rodríguez-Serrano I, Caldés N, de la Rúa C, Lechón Y (2017) Assessing the three sustainability pillars through the Framework for Integrated Sustainability Assessment (FISA): case study of a Solar Thermal Electricity project in Mexico. J Clean Prod 149:1127–1143. https://doi.org/10.1016/j.jclepro.2017.02.179

  11. Mcquate S (2019) Emissions from electricity generation lead to disproportionate number of premature deaths for some racial groups. In: Univ. Washington. https://www.washington.edu/news/2019/11/20/electricity-generation-emissions-premature-deaths/. Accessed 2 Mar 2021

  12. Southern Environmental Law Center’s Solar Initiative (2017) The Environmental Review of Solar Farms in the Southeast U.S.: Maximizing Benefits and Minimizing Impacts to Drive Smart, Sustainable Development of Solar Power. Chapel Hill, NC

  13. Kerry Thoubboron (2018) Are solar panels toxic to the environment? https://news.energysage.com/solar-panels-toxic-environment/. Accessed 2 Mar 2021

  14. Fedorova E, Pongrácz E (2019) Cumulative social effect assessment framework to evaluate the accumulation of social sustainability benefits of regional bioenergy value chains. Renew Energy 131:1073–1088. https://doi.org/10.1016/j.renene.2018.07.070

  15. Schlör H, Venghaus S, Zapp P et al (2018) The energy-mineral-society nexus—a social LCA model. Appl Energy 228:999–1008. https://doi.org/10.1016/j.apenergy.2018.06.048

  16. Holger S, Jan K, Petra Z et al (2017) The social footprint of hydrogen production—a social life cycle assessment (S-LCA) of alkaline water electrolysis. Energy Procedia 105:3038–3044. https://doi.org/10.1016/j.egypro.2017.03.626

  17. Wang Z, Osseweijer P, Duque JP (2018) Assessing social sustainability for biofuel supply chains: the case of aviation biofuel in Brazil. In: 2017 IEEE Conference on Technologies for Sustainability (SusTech), pp 1–5. https://doi.org/10.1109/SusTech.2017.8333474

  18. Corona B, Bozhilova-Kisheva KP, Olsen SI, San Miguel G (2017) Social life cycle assessment of a concentrated solar power plant in Spain: a methodological proposal. J Ind Ecol 21:1566–1577. https://doi.org/10.1111/jiec.12541

  19. Kabir E, Kim KH, Szulejko JE (2017) Social impacts of solar home systems in rural areas: a case study in Bangladesh. Energies 10:1–12. https://doi.org/10.3390/en10101615

  20. Traverso M, Asdrubali F, Francia A, Finkbeiner M (2012) Towards life cycle sustainability assessment: an implementation to photovoltaic modules. Int J Life Cycle Assess 17:1068–1079. https://doi.org/10.1007/s11367-012-0433-8

  21. Dubey S, Jadhav NY, Zakirova B (2013) Socio-economic and environmental impacts of silicon based photovoltaic (PV) technologies. Energy Procedia 33:322–334. https://doi.org/10.1016/j.egypro.2013.05.073

  22. Suuronen A (2017) Ecological and social impacts of photovoltaic solar power plants and optimization of their locations in northern Chile. Jyväskylä studies in biological and environmental science. 338

  23. International Organization for Standardization (2006) ISO 14040-Environmental management—Life Cycle Assessment—Principles and Framework. Int Organ Stand 3:20. https://doi.org/10.1016/j.ecolind.2011.01.007

  24. U.S. Energy Information Administration (2021) Electricity Consumption by Country. In: International. https://www.eia.gov/international/data/world/electricity/electricity-consumption?pd=2&p=0000002&u=0&f=A&v=mapbubble&a=-&i=none&vo=value&&t=C&g=00000000000000000000000000000000000000000000000001&l=249-ruvvvvvfvtvnvv1vrvvvvfvvvvvvfvvvou20evvvvvvvvvvnvvvs000. Accessed 2 Mar 2021

  25. International Renewable Energy Agency (2017) Renewable Energy Benefits: Understanding the Socio-Economics. 1–16

  26. Bonilla-Alicea RJ, Fu K (2019) Systematic map of the social impact assessment field. Sustainability 11:4106

  27. World Commission on Environment and Development (1987) Report of the World Commission on Environment and Development: Our Common Future. Oxford University Press, Oxford

  28. James KL, Randall NP, Haddaway NR (2016) A methodology for systematic mapping in environmental sciences. Environ Evid 5:7. https://doi.org/10.1186/s13750-016-0059-6

  29. Peruzzini M, Gregori F, Luzi A et al (2017) A social life cycle assessment methodology for smart manufacturing: the case of study of a kitchen sink. J Ind Inf Integr 7:24–32. https://doi.org/10.1016/j.jii.2017.04.001

  30. Grubert E (2018) Rigor in social life cycle assessment: improving the scientific grounding of SLCA. Int J Life Cycle Assess 23:481–491. https://doi.org/10.1007/s11367-016-1117-6

  31. Eren Y, Alev G, Arif M (2019) Environmental and social life cycle sustainability assessment of different packaging waste collection systems. Resour Conserv Recycl 143:119–132. https://doi.org/10.1016/j.resconrec.2018.12.028

  32. Siebert A, Bezama A, O’Keeffe S, Thrän D (2018) Social life cycle assessment: in pursuit of a framework for assessing wood-based products from bioeconomy regions in Germany. Int J Life Cycle Assess 23:651–662. https://doi.org/10.1007/s11367-016-1066-0

  33. Norris CB, Aulisio D, Norris GA (2012) Working with the Social Hotspots Database—methodology and findings from 7 social scoping assessments. In: Leveraging Technology for a Sustainable World, pp 581–586. https://doi.org/10.1007/978-3-642-29069-5_98

  34. Siebert A, O’Keeffe S, Bezama A et al (2018) How not to compare apples and oranges: generate context-specific performance reference points for a social life cycle assessment model. J Clean Prod 198:587–600. https://doi.org/10.1016/j.jclepro.2018.06.298

  35. Dubois-Iorgulescu AM, Saraiva AKEB, Valle R, Rodrigues LM (2018) How to define the system in social life cycle assessments? A critical review of the state of the art and identification of needed developments. Int J Life Cycle Assess 23:507–518. https://doi.org/10.1007/s11367-016-1181-y

  36. Arcese G, Lucchetti MC, Massa I, Valente C (2018) State of the art in S-LCA: integrating literature review and automatic text analysis. Int J Life Cycle Assess 23:394–405. https://doi.org/10.1007/s11367-016-1082-0

  37. Reap J, Roman F, Duncan S, Bras B (2008) A survey of unresolved problems in life cycle assessment. Int J Life Cycle Assess 13:290–300. https://doi.org/10.1007/s11367-008-0008-x

  38. Reap J, Roman F, Duncan S, Bras B (2008) A survey of unresolved problems in life cycle assessment. Int J Life Cycle Assess 13:374–388. https://doi.org/10.1007/s11367-008-0009-9

  39. Kjaer LL, Pagoropoulos A, Schmidt JH, McAloone TC (2016) Challenges when evaluating product/service-systems through life cycle assessment. J Clean Prod 120:95–104. https://doi.org/10.1016/j.jclepro.2016.01.048

  40. Reitinger C, Dumke M, Barosevcic M, Hillerbrand R (2011) A conceptual framework for impact assessment within SLCA. Int J Life Cycle Assess 16:380–388. https://doi.org/10.1007/s11367-011-0265-y

  41. Bianchi A, Ginelli E (2018) The social dimension in energy landscapes. City, Territ Archit 5:9. https://doi.org/10.1186/s40410-018-0085-5

  42. Janker J, Mann S, Rist S (2019) Social sustainability in agriculture—a system-based framework. J Rural Stud 65:32–42. https://doi.org/10.1016/j.jrurstud.2018.12.010

  43. Hossain MU, Poon CS, Dong YH et al (2018) Development of social sustainability assessment method and a comparative case study on assessing recycled construction materials. Int J Life Cycle Assess 23:1654–1674. https://doi.org/10.1007/s11367-017-1373-0

  44. Gregori F, Papetti A, Pandolfi M et al (2017) Digital manufacturing systems: a framework to improve social sustainability of a production site. Procedia CIRP 63:436–442. https://doi.org/10.1016/j.procir.2017.03.113

  45. Sierra LA, Pellicer E, Yepes V (2017) Method for estimating the social sustainability of infrastructure projects. Environ Impact Assess Rev 65:41–53. https://doi.org/10.1016/j.eiar.2017.02.004

  46. Fortier M-OP, Teron L, Reames TG et al (2019) Introduction to evaluating energy justice across the life cycle: a social life cycle assessment approach. Appl Energy 236:211–219. https://doi.org/10.1016/j.apenergy.2018.11.022

  47. van der Velden NM, Vogtländer JG (2017) Monetisation of external socio-economic costs of industrial production: a social-LCA-based case of clothing production. J Clean Prod 153:320–330. https://doi.org/10.1016/j.jclepro.2017.03.161

  48. Wang S-W, Hsu C-W, Hu AH (2016) An analytic framework for social life cycle impact assessment—part 1: methodology. Int J Life Cycle Assess 21:1514–1528. https://doi.org/10.1007/s11367-016-1114-9

  49. Anaya FC, Espírito-Santo MM (2018) Protected areas and territorial exclusion of traditional communities: analyzing the social impacts of environmental compensation strategies in Brazil. Ecol Soc 23:art8. https://doi.org/10.5751/ES-09850-230108

  50. Benoît C, Norris GA, Valdivia S et al (2010) The guidelines for social life cycle assessment of products: just in time! Int J Life Cycle Assess 15:156–163. https://doi.org/10.1007/s11367-009-0147-8

  51. Rafiaani P, Kuppens T, Van DM et al (2018) Social sustainability assessments in the biobased economy: towards a systemic approach. Renew Sustain Energy Rev 82:1839–1853. https://doi.org/10.1016/j.rser.2017.06.118

  52. Arvidsson R, Hildenbrand J, Baumann H et al (2018) A method for human health impact assessment in social LCA: lessons from three case studies. Int J Life Cycle Assess 23:690–699. https://doi.org/10.1007/s11367-016-1116-7

  53. Russo Garrido S, Parent J, Beaulieu L, Revéret JP (2018) A literature review of type I SLCA—making the logic underlying methodological choices explicit. Int J Life Cycle Assess 23:432–444. https://doi.org/10.1007/s11367-016-1067-z

  54. Dunmade I, Udo M, Akintayo T et al (2018) Lifecycle impact assessment of an engineering project management process—a SLCA approach. IOP Conf Ser Mater Sci Eng 413:012061. https://doi.org/10.1088/1757-899X/413/1/012061

  55. Zanchi L, Delogu M, Zamagni A, Pierini M (2018) Analysis of the main elements affecting social LCA applications: challenges for the automotive sector. Int J Life Cycle Assess 23:519–535. https://doi.org/10.1007/s11367-016-1176-8

  56. Ekener E, Hansson J, Gustavsson M (2018) Addressing positive impacts in social LCA—discussing current and new approaches exemplified by the case of vehicle fuels. Int J Life Cycle Assess 23:556–568. https://doi.org/10.1007/s11367-016-1058-0

  57. Poverty Reduction Group (PRMPR) and Social Development Department (SDV) (2003) A User’s Guide to Poverty and Social Impact Analysis. Washington, D.C.

  58. Fontes J, Tarne P, Traverso M, Bernstein P (2018) Product social impact assessment. Int J Life Cycle Assess 23:547–555. https://doi.org/10.1007/s11367-016-1125-6

  59. Gaviglio A, Bertocchi M, Marescotti ME, et al (2016) The social pillar of sustainability: a quantitative approach at the farm level. Agric Food Econ 4. https://doi.org/10.1186/s40100-016-0059-4

  60. Nichols Applied Management Management and Economic Consultants (2016) Benga Mining Ltd. Grassy Mountain Coal Project Socio-Economic Impact Assessment. https://iaac-aeic.gc.ca/050/documents/p80101/103941E.pdf

  61. Kjaer LL, Pigosso DCA, McAloone TC, Birkved M (2018) Guidelines for evaluating the environmental performance of Product/Service-Systems through life cycle assessment. J Clean Prod 190:666–678. https://doi.org/10.1016/j.jclepro.2018.04.108

  62. Weidema BP, Wesnæs MS (1996) Data quality management for life cycle inventories—an example of using data quality indicators. J Clean Prod 4:167–174. https://doi.org/10.1016/S0959-6526(96)00043-1

  63. Goedkoop MJ, de Beer I, Harmens R, et al (2020) Product Social Impact Assessment Handbook. Amersfoort: Pré Consultancy

  64. Ray C. Anderson Foundation (2021) Georgia Drawdown. https://www.drawdownga.org/. Accessed 2 Jan 2020

  65. Pedersen B, Hischier R, Bauer C (2013) Data quality guideline for ecoinvent database version 3. Swiss Centre for Life Cycle Inventories. Ecoinvent Report 3(1)

  66. LeBlanc P (2017) Total eclipse of reason on green energy child labor, toxic waste, and animal protection. In: NetRightDaily. https://dailytorch.com/2017/08/total-eclipse-reason-green-energy-child-labor-toxic-waste-animal-protection/. Accessed 24 Jan 2020

  67. Lombardozzi B (2014) The True Cost of Chinese Solar Panels: Part 3. In: Alliance Am. Manuf. https://www.americanmanufacturing.org/blog/the-true-cost-of-chinese-solar-panels-part-3/. Accessed 15 Jan 2020

  68. Zeballos-Roig J, Wang A (2019) Americans really want the US to adopt renewable energy like wind and solar power, while rejecting fossil fuels like coal. In: Bus. Insid. https://www.businessinsider.com/americans-really-want-the-us-adopt-renewable-energy-sources-2019-10. Accessed 26 Jan 2020

  69. Department of Energy (2014) Solar Energy in the United States. In: Off. Energy Effic. Renew. Energy. https://www.energy.gov/eere/solarpoweringamerica/solar-energy-united-states. Accessed 26 Jan 2020

  70. Richardson J (2018) Renewable energy has more economic benefits than you know. In: Clean Tech. https://cleantechnica.com/2018/03/10/renewable-energy-economic-benefits-know/. Accessed 28 Jan 2020

  71. United States Bureau of Labor Statistics (BLS) (2019) Solar Photovoltaic Installers. In: Occup. Outlook Handb. https://www.bls.gov/ooh/construction-and-extraction/home.htm. Accessed 28 Jan 2020

  72. Solar Energy Industries Association (2018) Solar Industry Research Data: Solar Industry Growing at a Record Pace. https://www.seia.org/solar-industry-research-data. Accessed 26 Jan 2020

  73. Barrett J, Yadken J (2019) The 2019 U.S. Energy and Employment Report. National Association of State Energy Officials, Energy Futures Initiative. https://www.usenergyjobs.org/2019-report

  74. National Renewable Energy Laboratory (2020) United States Irradiance Maps. https://www.nrel.gov/gis/solar.html. Accessed 26 Jan 2020

  75. D’Aquila J (2018) The current state of sustainability reporting: a work in progress. CPA J 88(7):44–50

  76. U.S. Department of Energy (2020) U.S. Department of Energy Sustainability Reporting. https://www.energy.gov/management/spd/us-department-energy-sustainability-reporting. Accessed 25 Jan 2020

  77. Vellini M, Gambini M, Prattella V (2017) Environmental impacts of PV technology throughout the life cycle: importance of the end-of-life management for Si-panels and CdTe-panels. Energy 138:1099–1111. https://doi.org/10.1016/j.energy.2017.07.031

  78. Michael Shellenberger (2018) If Solar Panels Are So Clean, Why Do They Produce So Much Toxic Waste? In: Forbes. https://www.forbes.com/sites/michaelshellenberger/2018/05/23/if-solar-panels-are-so-clean-why-do-they-produce-so-much-toxic-waste/#1b37cf32121c. Accessed 10 Jan 2020

  79. Leblanc R (2020) What is the Environmental Impact of Solar Power Generation? https://www.thebalancesmb.com/what-is-the-environmental-impact-of-solar-power-generation-4586409. Accessed 27 Jan 2020

  80. United States Census Bureau (2019) United States Census American Community 5-year Data GINI Coefficient. https://www.census.gov/data/developers/data-sets/acs-5year.html. Accessed 15 Dec 2019

  81. Principles for Responsible Investment (2017) Case study: conflict minerals and solar power. In: Environ. Soc. Gov. Issues. https://www.unpri.org/environmental-social-and-governance-issues/addressing-conflict-minerals-in-solar-power-production/621.article. Accessed 26 Jan 2020

  82. Corneau S (2018) Minerals in the green economy: solar panels and lithium-ion batteries. Intergovernmental Forum on Mining, Minerals, Metals and Sustainable Development. https://www.igfmining.org/minerals-green-economy-solar-panels-lithium-ion-batteries/. Accessed 27 Nov 2020

  83. Mishkin S (2020) Here’s how much adding solar panels will boost your home’s value. https://money.com/home-value-solar-panels/. Accessed 12 Jan 2020

  84. Wang B (2008) Deaths per TWh for all energy sources: Rooftop solar power is actually more dangerous than Chernobyl. https://www.nextbigfuture.com/2008/03/deaths-per-twh-for-all-energy-sources.html. Accessed 5 Jan 2020

  85. Fluxman C (2019) Yet another fall casualty—solar panel co cited. https://sunnewsreport.com/yet-another-fall-casualty-solar-panel-co-cited/. Accessed 11 Dec 2019

  86. CED Greentech (2020) Can Solar Panels Be Recycled? | CED Greentech. https://www.cedgreentech.com/article/can-solar-panels-be-recycled. Accessed 8 Dec 2019

  87. Arup (2020) Circular photovoltaics: circular business models for Australia’s solar photovoltaics industry. https://www.arup.com/-/media/arup/files/publications/c/circular-photovoltaics.pdf

  88. Sica D, Malandrino O, Supino S et al (2018) Management of end-of-life photovoltaic panels as a step towards a circular economy. Renew Sustain Energy Rev 82:2934–2945. https://doi.org/10.1016/j.rser.2017.10.039

  89. Heath G (2019) Moving toward a circular economy of materials for clean manufacturing. (No. NREL/PR-6A50-73689). National Renewable Energy Lab.(NREL), Golden, CO (United States)

  90. Desai J, Nelson M (2017) Are we headed for a solar waste crisis? Environmental Progress. https://conspiracytech.com/Are%20we%20headed%20for%20a%20solar%20waste%20crisis_%20%E2%80%94%20Environmental%20Progress.pdf

  91. Dini J (2018) Solar panel waste: a disposal problem. In: Watts Up With That. https://wattsupwiththat.com/2018/12/23/solar-panel-waste-a-disposal-problem/

  92. Bhandari B, Lim N (2020) The dark side of China's solar boom. In: Sixth Tone. https://www.sixthtone.com/news/1002631/the-dark-side-of-chinas-solar-boom-. Accessed 5 Jan 2020

  93. Springer N (2018) Sunsetting solar panels: U.S. photovoltaic cell recycling incentives are beginning. GreenBiz. Published Sept 18, 2018. https://www.greenbiz.com/article/sunsetting-solar-panels-us-photovoltaic-cell-recycling-incentives-are-beginning

  94. Office of Energy Efficiency and Renewable Energy (2019) Federal Resources for Community Solar. https://www.energy.gov/eere/solarpoweringamerica/federal-resources-community-solar. Accessed 12 Jan 2020

  95. Misbrener K (2019) Georgia PSC approves net metering for up to 5,000 solar customers. In: Solar Power World. https://www.solarpowerworldonline.com/2019/12/georgia-psc-approves-solar-net-metering/. Accessed 5 Dec 2019

  96. Cook JJ, Shah M (2018) Reducing Energy Burden with Solar: Colorado's Strategy and Roadmap for States (No. NREL/TP-6A20-70965). National Renewable Energy Lab.(NREL), Golden, CO (United States)

  97. Terrell R (2019) High Energy Burdens Keep Low Income Georgians From Benefits Of Solar Power. https://www.gpb.org/news/2019/07/17/high-energy-burdens-keep-low-income-georgians-benefits-of-solar-power. Accessed 5 Jan 2020

  98. Mahoney M, Bennett D, Grushack S (2010) City of Atlanta Sustainability Plan. Mayor’s Office of Sustainability, City of Atlanta

  99. Hall K (2017) Atlanta’s 100 % Clean Energy Plan: Resolution 17-R-3510. Mayor’s Office of Resilience, One Atlanta. https://www.100atl.com/

  100. Rhone N (2019) Athens 4th Georgia city to adopt 100% clean energy plan. Green Environmental Memo. Published May 23, 2019. https://georgiaenvironmentalmemo.com/2019/05/23/athens-4th-georgia-city-to-adopt-100-clean-energy-plan/#:~:text=Athens%20is%20the%20fourth%20city,clean%2C%20renewable%20electricity%20by%202035

  101. Lastinger M (2019) Georgia Power's commitment to solar energy involves Guyton. Effingham Herald. Published 29 Apr 2019. https://www.effinghamherald.net/local/georgia-powers-commitment-solar-energy-involves-guyton/

  102. Hsu A, Kelly ML (2019) How Georgia became a surprising bright spot in the U.S. solar industry. The Climate Center. Published 24 Jun 2019. https://theclimatecenter.org/how-georgia-became-a-surprising-bright-spot-in-the-u-s-solar-industry/

  103. Georgia Solar Energy Association (2018) Bringing the benefits of solar to Georgia. https://www.gasolar.org/. Accessed 27 Jan 2020

  104. Kempner M (2019) Georgia solar factory scores on tariffs; others in industry take a hit

  105. Rodgers BL (2020) Protesters urge PECO to invest in solar energy. https://www.dailylocal.com/news/protesters-urge-peco-to-invest-in-solar-energy/article_cdeb60a8-a813-11e9-900d-5f75ccb2e7f1.html. Accessed 20 Jan 2020

  106. Hausman S (2019) Public protest casts a shadow on solar arrays. Radio IQ. Published 11 Mar 2019. https://www.wvtf.org/news/2019-03-11/public-protest-casts-a-shadow-on-solar-arrays

  107. Broström T, Svahnström K (2011) Solar Energy and Cultural-Heritage Values. In World Renewable Energy Conference, Linköping, May 2011. Linköping University Electronic Press. pp 2034–2040

  108. Dauenhauer PM, Frame D, Eales A et al (2020) Sustainability evaluation of community-based, solar photovoltaic projects in Malawi. Energy Sustain Soc 10:12. https://doi.org/10.1186/s13705-020-0241-0

  109. Tidwell J, Tidwell A, Nelson S (2018) Surveying the solar power gap: assessing the spatial distribution of emerging photovoltaic solar adoption in the State of Georgia, USA. Sustainability 10:4117. https://doi.org/10.3390/su10114117

  110. Tidemann C, Engerer N, Markham F et al (2019) Spatial disaggregation clarifies the inequity in distributional outcomes of household solar PV installation. J Renew Sustain Energy 11:035901. https://doi.org/10.1063/1.5097424

  111. Hsu J (2019) Solar Power’s Benefits Don’t Shine Equally on Everyone. https://www.scientificamerican.com/article/solar-powers-benefits-dont-shine-equally-on-everyone/. Accessed 15 Dec 2019

  112. Department of Energy (2020) Low Income Community Energy Solutions Partnering with State and Local Governments. https://www.energy.gov/eere/slsc/low-income-community-energy-solutions

  113. Walton BR (2019) Regulators unanimously approve Georgia Power plan, adding 80 MW storage. Utility Dive. Published 17 Jul 2019. https://www.utilitydive.com/news/regulators-unanimously-approve-georgia-power-plan-including-80-mw-energy-s/558919/

  114. Daniel J (2019) The Energy Burden: How Bad is it and How to Make it Less Bad. Union of Concerned Scientists: The Equation. Published Feb 26, 2019. https://blog.ucsusa.org/joseph-daniel/how-to-make-energy-burden-less-bad/

  115. U.S. Department of Energy (2018) Low-income household energy burden varies among states—efficiency can help in all of them. US DOE Office of Energy Efficiency and Renewable Energy. Published December 2018. DOE/GO-102018-5122. https://www.energy.gov/sites/prod/files/2019/01/f58/WIP-Energy-Burden_final.pdf

  116. U.S. Department of Energy (2018) A Consumer's Guide to Fire Safety with Solar Systems. Office of Energy Efficiency & Renewable Energy, Solar Energy Technologies Office. https://www.energy.gov/eere/solar/consumers-guide-fire-safety-solar-systems

  117. Simpson K (2020) Solar energy: safety risks and how to prevent them. https://www.thehartford.com/resources/energy/solar-energy-risks. Accessed 12 Jan 2020

  118. Marsh J (2019) Solar Panel Safety: Are Solar Panels Safe. EnergySage. https://news.energysage.com/solar-panel-safety-need-know/

  119. Lee K (2020) What Are the Dangers of Solar Panels? In: Seattlepi. https://education.seattlepi.com/dangers-solar-panels-6127.html. Accessed 10 Jan 2020

  120. Bruggers J (2018) How Georgia became a top 10 solar state, with lawmakers barely lifting a finger. Inside Climate News: Clean Energy. Published 14 Jun 2018. https://insideclimatenews.org/news/14062018/georgia-solar-power-renewable-utility-scale-clean-energy-investments-2018-election/

  121. Prieto C, Gunning S (2019) Utility barriers to rooftop solar in Georgia. In: PV Mag. https://pv-magazine-usa.com/2019/11/04/utility-barriers-to-rooftop-solar-in-georgia/. Accessed 20 Jan 2020

  122. Food and Agriculture Organization of the United Nations, Deutsche Gesellschaft für Internationale Zusammenarbeit GmbH, Global Alliance for Clean Cookstoves, et al (2018) The Global Plan of Action for Sustainable Energy Solutions in Situations of Displacement: Framework for Action. https://unitar.org/sites/default/files/media/file/gpa_flyer_september_2018_3.pdf

  123. Deaton J (2018) Green Gentrification Comes With Its Own Curse. In: Clean Tech. https://cleantechnica.com/2018/02/03/green-gentrification-comes-curse/. Accessed 12 Jan 2020

  124. Jossi F (2019) Solar is thriving in low-income Minneapolis neighborhoods. https://apnews.com/article/639edff630f04b7e88208c767eea5581. Accessed 12 Dec 2019

  125. Marincola L (2017) Making solar power affordable in developing countries. In: UCLA Inst. Environ. Sustain. https://www.ioes.ucla.edu/news/making-solar-power-affordable-developing-countries/

  126. The World Bank (2017) Results Briefs: Solar. In: A 750 ultra-mega Sol. plant will Help power Delhi’s metro rail Syst. India. https://www.worldbank.org/en/results/2017/11/29/solar. Accessed 4 Jan 2020

  127. National Renewable Energy Laboratory (NREL) (2019) 2019 National Solar Radiation Database (NSRDB) US National Horizontal Irradiance Values. https://maps.nrel.gov/nsrdb-viewer/?aL=chXUF-%255Bv%255D%3Dt%26f69KzE%255Bv%255D%3Dt%26f69KzE%255Bd%255D%3D1&bL=clight&cE=0&lR=0&mC=40.27952566881291%2C-108.10546875&zL=5. Accessed 4 Feb 2020

  128. Guzman GG (2018) 2017 United States Gini coefficient values for states. ACSBR/17-01. United States Census Bureau. Issued September 2018. https://www.census.gov/content/dam/Census/library/publications/2018/acs/acsbr17-01.pdf

  129. Wang B (2008) Deaths per TWh for all energy sources: rooftop solar power is actually more dangerous than Chernobyl. Next Big Future. Published 14 Mar 2008. https://www.nextbigfuture.com/2008/03/deaths-per-twh-for-all-energy-sources.html

  130. Maile K (2019) Shining a light on solar panel recycling. In: Recycl. Today. https://www.recyclingtoday.com/article/end-of-life-solar-panel-recycling/. Accessed 18 Jan 2020

  131. Wesoff E, Beetz B (2020) Solar panel recycling in the US—a looming issue that could harm industry growth and reputation. In: PV Mag. https://pv-magazine-usa.com/2020/12/03/solar-panel-recycling-in-the-us-a-looming-issue-that-could-harm-growth-and-reputation/. Accessed 2 Jan 2021

  132. Pickerel K (2019) Hanwha Q CELLS completes 1.7-GW panel assembly facility in Georgia. In: Sol. Power World. https://www.solarpowerworldonline.com/2019/02/hanwha-q-cells-completes-1-7-gw-panel-assembly-facility-in-georgia/. Accessed 20 Feb 2020

  133. Georgia Department of Energy (2020) Georgia Solar Energy Industry Overview. https://www.georgia.org/sites/default/files/wp-uploads/2013/09/Solar-Industry-in-Georgia.pdf. Accessed 19 Jan 2020

  134. Technology Association of Georgia (2020) Where Georgia Leads: Smart Energy. https://www.tagonline.org/wp-content/uploads/2019/12/smart-energy-brochure.pdf. Accessed 20 Jan 2020

  135. Solar Energy Industries Association (2019) U.S. Solar Market Insight. https://www.seia.org/us-solar-market-insight. Accessed 18 Jan 2020

  136. Alcorn J, Jones S, O'looney J (2016) Going solar in Georgia: opportunities for local government. Carl Vinson Institute of Government, The University of Georgia. https://cviog.uga.edu/_resources/documents/publications/going-solar-in-georgia.pdf

  137. Replogle J (2010) Solar Installer’s Death Points to Job Hazards in a Growing Green Industry. https://www.fairwarning.org/2010/10/solar-installers-death-points-to-job-hazards-in-a-growing-green-industry/. Accessed 6 Jan 2020

Acknowledgements

The authors wish to acknowledge the anonymous experts who participated in the online survey. These experts provided feedback on the challenges identified in the social assessment field through a literature review exercise. In addition, the authors gratefully acknowledge the feedback and participation of the students and professors of the Capstone Engineering Design Course in the School of Mechanical Engineering at the Georgia Institute of Technology.

Funding

Not applicable.

Author information

Affiliations

Authors

Contributions

Both authors contributed significantly to the completion of the work presented in the manuscript. RJB-A contributed to the development of the research plan and completed the case study application of the framework. KF contributed significantly to the structuring of the document, draft preparation, and editing process of the manuscript. Both authors read and approved the final manuscript.

Corresponding author

Correspondence to Katherine Fu.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Reference indicator database.

Additional file 2.

S-LCA Impact Analysis of Solar Panels.

Additional file 3.

S-LCA capstone report rubric for qualitative assessment.

Additional file 4.

S-LCA Framework.

Additional file 5.

Data quality assessment criteria based on the SIA Handbook [58] and the Pedigree matrix method [62].

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
About this article
Cite this article

Bonilla-Alicea, R.J., Fu, K. Social life-cycle assessment (S-LCA) of residential rooftop solar panels using challenge-derived framework. Energ Sustain Soc 12, 7 (2022). https://doi.org/10.1186/s13705-022-00332-w
Keywords

  • Social life-cycle assessment (S-LCA)
  • S-LCA framework
  • Type I S-LCA
  • Case study analysis
  • Life-cycle sustainability assessment (LCSA)
  • Social assessment of technology
  • Rooftop solar panel