Lawmakers want Minnesota to reconsider nuclear plants



Advocates for nuclear energy hope this is the year Minnesota takes a step toward lifting its 32-year-old moratorium on new nuclear power plants in the state.

A coalition of utilities, counties, clean energy groups and labor unions known as the Minnesota Nuclear Energy Alliance is pushing the Legislature to reconsider the ban.

They say Minnesota needs nuclear energy to help meet a state law requiring utilities to provide carbon-free power by 2040, and to meet the growing demand for electricity.

“We do feel like having a carbon-free requirement in this state and having … basically one resource that's available at all times and carbon free, it really makes it important that that resource be on the table for our state's future,” said Darrick Moe, president and CEO of the Minnesota Rural Electric Association, which represents the state’s electric cooperatives.

Moe testified in favor of the bill during a Senate committee hearing on Wednesday.

Currently, Minnesota has a moratorium on new nuclear plants. The Legislature enacted it in 1994, as part of a compromise to allow Xcel Energy to store nuclear waste at its Prairie Island plant near Red Wing.

As a first step toward opening the door to new nuclear energy in Minnesota, some state lawmakers are backing a bill authorizing a study of the potential to build new nuclear plants, including small, modular reactors that are currently in development in the U.S.

The study would examine costs, federal regulations, technological advances, location issues and environmental impacts, among other factors. The proposal received a mixed response this week during committee hearings at the Minnesota Capitol.

Xcel Energy operates two nuclear plants in Minnesota, at Prairie Island and Monticello, both built in the 1970s. It plans to continue operating them for the next several decades.

As utilities retire coal-fired power plants and shift toward more renewable energy sources such as wind and solar, interest in nuclear has been surging in recent years. Its supporters see it as a constant, low-carbon energy source that could help meet the anticipated surge in demand for electricity for data centers, EVs and manufacturing.

Several U.S. states have recently eased their bans on new nuclear plants. And the Trump administration has been pushing the development of new, experimental reactors around the U.S.

But nuclear energy comes with its own challenges, including high construction costs, safety concerns and the problem of storing radioactive nuclear waste.

Currently, there is no permanent storage site for spent nuclear fuel in the U.S., so it is stored on site at nuclear plants, including Prairie Island and Monticello.

The Xcel Energy nuclear generating plant near Monticello, Minn., pictured on March 24, 2023.
Ben Hovland | MPR News

Legislators have proposed a nuclear study in previous sessions, but it’s failed to pass. This year’s bill has the support of the Prairie Island Indian Community, which lawmakers from both parties see as essential to its passage.

Prairie Island’s reservation is just 700 yards from a nuclear plant owned by Xcel Energy. Xcel built the plant in the early 1970s without consent from the tribe, which has long objected to the storage of spent nuclear fuel at the site.

Both House and Senate versions of the bill require that the study include nuclear waste storage and impacts on surrounding communities.

The bill’s supporters noted that a study isn’t a guarantee that a nuclear plant will be built in Minnesota anytime soon, if ever.

“We're not permitting anything,” said Rep. Spencer Igo, R-Wabana Township, the House bill’s sponsor. “We're not making anything happen. But we’re at least going to start opening the door to learn about what it looks like for Minnesota.”

The proposal has encountered a few bumps. A Senate version of the bill, sponsored by Sen. Andrew Mathews, R-Princeton, passed out of the energy committee on Wednesday. But a House energy committee postponed taking action on a similar bill, rather than advancing it.

Some DFL lawmakers said they thought the bill needs more work. They expressed concern that funding for the study would come from the state’s Renewable Development Account. Xcel pays annual fees into the account for storing nuclear waste at Prairie Island and Monticello, and the money is used to develop renewable energy projects.

Rep. Patty Acomb, DFL-Minnetonka, said she wants the study to compare the costs of nuclear energy with other types of electricity generation.

After the House committee’s vote, Igo issued a statement, saying Minnesota’s carbon-free energy goals “make it critical to explore all viable energy sources.”

“Nuclear energy has the potential to lower costs, create economic opportunities, and strengthen Minnesota’s energy future,” he stated. “This bill is designed to advance a study that will help identify the most effective path forward.”





AWS Edge Locations: A Brief Introduction 

AWS edge locations are data centers, often hosted in third-party colocation facilities, built to minimize latency when AWS delivers content and services. Each is a relatively small site located close to end users so that responses arrive quickly.

Looking more closely at what happens: when a user sends a request, instead of the response coming back from the primary server, the request is routed to the nearest edge location and answered from there, which makes it fast.

For instance, suppose your data is housed in an S3 bucket in Australia, but some of your traffic comes from Canada. AWS will start caching your data in one of the edge locations in Canada, so when a request arrives from there, it is served from that Canadian edge cache rather than traveling all the way to Australia. The result is lower latency and a better user experience.
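The Australia-to-Canada flow above can be sketched as a toy simulation. This is purely illustrative: the city names, the origin name, and the cache logic are our own invented stand-ins, not the actual CloudFront implementation.

```python
# Toy simulation of edge caching: a miss goes back to the origin once,
# after which the object is served from the nearest edge cache.
ORIGIN = "s3-bucket-sydney"                   # hypothetical origin in Australia
EDGE_CACHES = {"toronto": {}, "sydney": {}}   # per-edge content caches

def fetch(key, nearest_edge):
    """Serve `key` for a user, filling the nearest edge cache on a miss."""
    cache = EDGE_CACHES[nearest_edge]
    if key in cache:
        return cache[key], "edge-hit"
    # Cache miss: retrieve from the origin, then store at the edge.
    content = f"object:{key} from {ORIGIN}"
    cache[key] = content
    return content, "origin-fetch"

# A user in Canada: the first request reaches the origin, later ones do not.
_, first = fetch("logo.png", nearest_edge="toronto")
_, second = fetch("logo.png", nearest_edge="toronto")
print(first, second)  # origin-fetch edge-hit
```

The key point the sketch captures is that only the first request pays the round trip to the distant origin; subsequent requests are answered locally.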

Edge locations are valued for delivering speedier responses to user requests, minimizing access time and delivery delay. They are located in most of the world’s major cities and are used by CloudFront (AWS’s CDN) for fast delivery to end users.

Who Uses AWS Edge Locations?

Services that use edge locations to reduce latency include:

CloudFront: It uses edge locations to cache copies of the content it serves, allowing that content to be delivered to users more quickly.
Route 53: It answers DNS queries from edge locations, allowing them to be resolved more quickly.
AWS Shield and AWS WAF (Web Application Firewall): They screen traffic at edge locations to block unwanted requests.

Benefits of AWS Edge Locations

Quick Response: Because an edge location sits very close to where a request originates, it can deliver static content with a fast response.

Minimal Access Time: Since edge locations respond quickly, they directly reduce access time for the user.

Low Latency: An edge location is physically closer to the user than the primary server, so latency is lower.

Broader Reach: Edge locations, which are often housed in colocation facilities, extend the reach of the AWS network. They have ample bandwidth and connections to other networks and service providers, which gives AWS wide-ranging connectivity, including to domestic ISPs.
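The latency benefit comes largely from physics: a back-of-envelope calculation shows why proximity matters. The distances below are rough illustrative figures, and the calculation counts propagation delay only, ignoring routing hops and processing time.

```python
# Light in optical fiber travels at roughly 200,000 km/s (~2/3 of c),
# i.e. about 200 km per millisecond, one way.
FIBER_KM_PER_MS = 200.0

def round_trip_ms(distance_km):
    """Minimum round-trip propagation delay over fiber, in milliseconds."""
    return 2 * distance_km / FIBER_KM_PER_MS

far = round_trip_ms(15000)   # e.g. Canada -> an origin in Australia
near = round_trip_ms(300)    # e.g. Canada -> a nearby edge location
print(f"origin: ~{far:.0f} ms, edge: ~{near:.0f} ms")  # ~150 ms vs ~3 ms
```

No amount of server tuning can beat the round trip to a distant origin, which is why caching at a nearby edge is the effective lever.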

Edge Locations In India

Multiple CloudFront edge locations can be found in India, approximately 17 in all: 4 each in Hyderabad and New Delhi, 3 each in Bangalore and Mumbai, 2 in Chennai, and 1 in Kolkata. Globally, there are approximately 44 AWS edge locations.

Edge Locations Vs. Availability Zones Vs. AWS Regions

AWS Regions

What happens in the event of unanticipated situations, such as a natural calamity? AWS addresses this by grouping data centers into Regions, which are established worldwide to be close to business traffic demand.

To begin, AWS operates many data centers across all Regions, providing compute, storage, and other resources for hosting your apps. Second, a high-speed fiber network, managed by AWS, connects all of the Regions. Finally, Regions are isolated from one another: data does not enter or leave a Region unless you explicitly authorize the movement of that data.

Availability Zones

An Availability Zone (AZ) comprises one or more separate data centers in a Region, each with redundant power, networking, and connectivity, housed in different buildings. Running production apps and databases across Availability Zones makes them more highly available, fault tolerant, and scalable than a single data center would allow. There are currently 84 Availability Zones spread over 26 geographic Regions around the world.

Although each Availability Zone is autonomous, they are linked by low-latency connections within a Region. AWS gives users the freedom to place instances and store data across multiple Regions and across multiple Availability Zones within each Region.
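The fault-tolerance idea behind multi-AZ placement can be sketched in a few lines. The AZ names follow AWS's naming style, but the round-robin placement logic is our own illustration, not an AWS API.

```python
# Hypothetical sketch: spread instances across a Region's AZs so that
# losing one AZ leaves copies of the workload running elsewhere.
from itertools import cycle

AZS = ["us-east-1a", "us-east-1b", "us-east-1c"]

def place(instances, azs=AZS):
    """Round-robin instances across Availability Zones."""
    return {name: az for name, az in zip(instances, cycle(azs))}

placement = place([f"web-{i}" for i in range(6)])
# Simulate a full outage of one AZ:
surviving = {n: az for n, az in placement.items() if az != "us-east-1a"}
print(len(surviving))  # 4 of 6 instances remain if one AZ fails
```

Real deployments achieve the same effect with services such as Auto Scaling groups spanning multiple AZs; the principle is identical.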


Edge Locations

What if the users are located in various parts of the globe or in locations that are not in your AWS Regions? Luckily, your organization will not need to start building a new data center. As already explained, this problem gets solved with the help of AWS Edge Locations.

Amazon CloudFront is an AWS service that lets you provide information, video, apps, and APIs to clients across the globe. Low latency and high transmission rates are provided via Amazon CloudFront. But the most crucial aspect is that this service makes use of so-called Edge locations to speed up the connection with customers irrespective of their location.

An organization can send content from Regions to a specific set of Edge locations around the world because Edge locations and Regions are independent infrastructure components. This enables both communication and content delivery to be accelerated. At the same time, Amazon Route 53, a well-known domain name service (DNS) on AWS, is running in Edge locations. This ensures reduced latency by directing clients to the proper web pages.

What is AWS CloudFront?

AWS CloudFront is an excellent content delivery network (CDN) solution that is extremely fast and capable of delivering data to users across the world with minimal delay.

The crucial aspect here is that your data stays highly secure, thanks to a variety of solid security measures and encryption options, and CloudFront integrates with Amazon Route 53, AWS Shield, and AWS WAF, among other services, to defend against various forms of attack.

Why Choose Amazon CloudFront?

Let us find out why people prefer AWS CloudFront and why you should also choose the same. We have compiled its many benefits below:

Quick Content Delivery

The Amazon CloudFront network has more than 200 points of presence, allowing you to deliver content to AWS consumers and end users quickly and with minimal latency.

Pocket Friendly

Amazon CloudFront offers a pay-as-you-go pricing structure with a handful of customizable pricing plans to help you save money.

High Security

Amazon CloudFront is among the most secure content delivery networks available, and it can help you secure both your application and your network.

Compatibility with AWS Services

Amazon CloudFront is compatible with other AWS services such as Amazon EC2, Amazon S3, and Elastic Load Balancing.

It assists developers with AWS Cloud Development Kit, various APIs, and log monitoring, and it can simply interface with Amazon Cloudwatch, among other things, making the developer’s job easier.

Will using the edge result in lower-latency access to EC2?

Edge locations do not directly lower latency to EC2 instances themselves. An edge location is a site that CloudFront uses to cache copies of your content, enabling faster delivery to users in any location. So while the network path to an EC2 instance is unchanged, putting CloudFront in front of content that originates on EC2 can reduce the latency users actually experience.

To fully resolve such latency issues, it would be beneficial for AWS to operate a region in Africa. AWS regions consist of multiple availability zones, each made up of one or more separate data centers, providing low-latency connectivity within the region. With a region in Africa, users on the continent would experience improved latency when accessing AWS resources.

It is important to note that edge locations primarily serve requests for CloudFront, which is a content delivery network (CDN). CDN technologies aim to reduce latency by caching static content closer to end users. In combination with AWS CloudFront, edge locations help optimize content delivery and provide low-latency connectivity.

While edge locations play a crucial role in delivering content efficiently through CloudFront, AWS Route 53 is responsible for DNS services. Requests made to CloudFront or Route 53 are automatically routed to the nearest edge location, ensuring low latency regardless of the user’s location.

Is the edge just a way to speed up services’ frontends or the services themselves?

The edge serves as a means to enhance the performance of both service frontends and the services themselves. It allows for improved access to various AWS services, which includes an extensive range of options. While it can potentially enhance latency to certain services, it does not solely focus on speeding up frontends. To truly address latency issues, the deployment of a new AWS region in Africa would be most beneficial.


What does “access services located at AWS” mean in this context?

In this context, the phrase “access services located at AWS” refers to the ability to utilize and make use of the various services available on the Amazon Web Services (AWS) platform. AWS offers a wide range of services for computing, storage, databases, networking, security, analytics, machine learning, and more. By accessing these services, users can leverage the capabilities and functionalities provided by AWS to meet their specific requirements.

It is important to note that there are numerous AWS services available, with hundreds of options to choose from. These services cover various aspects of cloud computing and cater to different business needs. Examples of AWS services include Amazon S3 for scalable storage, Amazon EC2 for virtual servers, Amazon RDS for managed databases, Amazon Redshift for data warehousing, AWS Lambda for serverless computing, and many others.

While accessing different AWS services can offer potential benefits, such as improved efficiency and flexibility, it may not necessarily address latency issues directly. Latency refers to the time delay experienced when transmitting data over a network, and accessing AWS services on their own may not have a significant impact on improving latency.

To address latency issues more effectively, an ideal solution would be the deployment of a new AWS region in Africa. A region in closer proximity to the users in Africa would minimize the distance data needs to travel, reducing latency and improving the overall performance of AWS services for users in that region.

Can S3 objects be cached via Edge locations?

Yes, S3 objects can be cached via Edge locations using CloudFront. Although S3 itself does not have the direct capability to cache objects, CloudFront, which is a content delivery network (CDN) service provided by Amazon Web Services (AWS), can be used to cache and distribute S3 objects to Edge locations.

CloudFront acts as an intermediary between S3 and the end users accessing the objects. When a user requests an S3 object, CloudFront checks if it already has a cached copy of that object in one of its Edge locations. If the object is found in the cache, CloudFront delivers it directly from the Edge location, resulting in reduced latency and improved performance. If the object is not in the cache, CloudFront retrieves it from the S3 bucket, stores it in its cache, and then delivers it to the user.

By caching S3 objects via CloudFront’s Edge locations, the objects become readily available at locations closer to the end users, reducing the need for requests to be sent back to the S3 origin server. This not only improves the overall performance and responsiveness of accessing S3 objects but also helps mitigate network congestion and latency.

Can Route 53 automatically route to Edge locations based on latency?

Route 53 does not route requests to edge locations based on latency. Route 53 does offer routing policies that can be configured on factors such as geographic location, latency, and weighted distribution, but its latency-based routing chooses among your own endpoints in different AWS Regions, not among edge locations. DNS queries to Route 53 are themselves answered from a nearby edge location, and it is worth noting that low measured latency does not necessarily mean the geographically nearest edge location.
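Latency-based routing, as described above, picks the record whose Region has measured the lowest latency for a given resolver. The sketch below is conceptual: the resolver names, latency figures, and IP addresses are invented for illustration, and real Route 53 bases its choice on its own network measurements.

```python
# Conceptual sketch of Route 53 latency-based routing: per resolver,
# return the record in the lowest-latency AWS Region.
MEASURED_LATENCY_MS = {               # resolver -> {region: latency in ms}
    "resolver-frankfurt": {"eu-west-1": 25, "us-east-1": 95},
    "resolver-virginia": {"eu-west-1": 90, "us-east-1": 8},
}
RECORDS = {"eu-west-1": "52.16.0.10", "us-east-1": "54.90.0.20"}

def resolve(resolver):
    """Return the record whose Region shows the lowest measured latency."""
    latencies = MEASURED_LATENCY_MS[resolver]
    best_region = min(latencies, key=latencies.get)
    return RECORDS[best_region]

print(resolve("resolver-frankfurt"))  # 52.16.0.10 (eu-west-1 wins)
```

In real configurations each latency record carries a `SetIdentifier` and an associated Region; Route 53 answers each query with the record for whichever of those Regions its measurements rank fastest for the querying resolver.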

What services do Edge locations serve requests for?

Edge locations serve requests for CloudFront and Route 53. CloudFront is a globally distributed content delivery network designed to deliver content with low latency, high transfer speeds, and high availability. It acts as a cache, storing frequently accessed content and serving it from the closest edge location to the end user, regardless of their geographical location. Route 53 is a highly scalable and reliable DNS (Domain Name System) service that routes end user requests to the appropriate resources, such as websites or applications, based on the domain name. By leveraging edge locations, both CloudFront and Route 53 ensure that requests are automatically routed to the nearest edge location, resulting in reduced latency and providing a high-performance experience to end users, regardless of their location.

In A Nutshell

These AWS edge locations provide consumers with stable network connectivity, reduced latency, and maximum throughput. Wondering whether you have ever made use of an AWS edge location? You probably have if you have ever used AWS or are an AWS customer. Services like CloudFront and Route 53 already provide edge location benefits, which means you have directly or indirectly used AWS edge locations. So, next time you see a quick response, you know who to thank for it.
