Xu, Jinlai (2022) Geo-distributed Edge and Cloud Resource Management for Low-latency Stream Processing. Doctoral Dissertation, University of Pittsburgh. (Unpublished)
Abstract
The proliferation of Internet-of-Things (IoT) devices is rapidly increasing the demand for efficient, low-latency processing of stream data generated close to the edge of the network.
Edge Computing provides a layer of infrastructure to fill latency gaps between the IoT devices and the back-end cloud computing infrastructure.
A large number of IoT applications require continuous processing of data streams in real time.
Edge computing-based stream processing techniques that carefully consider the heterogeneity of the computing and network resources available in the geo-distributed infrastructure provide significant benefits in optimizing the throughput and end-to-end latency of the data streams.
Managing geo-distributed resources operated by individual service providers raises new challenges in terms of effective global resource sharing and achieving global efficiency in the resource allocation process.
In this dissertation, we present a distributed stream processing framework that optimizes the performance of stream processing applications through a careful allocation of computing and network resources available at the edge of the network.
The proposed approach differentiates itself from the state-of-the-art through its careful consideration of data locality and resource constraints during physical plan generation and operator placement for the stream queries.
Additionally, it considers co-flow dependencies that exist between the data streams to optimize the network resource allocation through an application-level rate control mechanism.
The proposed framework incorporates resilience through a cost-aware partial active replication strategy that minimizes the recovery cost when applications incur failures.
The framework employs a reinforcement learning-based online learning model for dynamically determining the level of parallelism to adapt to changing workload conditions.
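As a rough illustration of such an online learning controller (not the dissertation's actual model), the sketch below uses a simple epsilon-greedy learner that selects a parallelism level and updates its value estimates from an observed reward; the candidate levels, the exploration rate, and the use of negative latency as the reward are all illustrative assumptions.

```python
import random

# A minimal sketch, assuming an epsilon-greedy learner stands in for the
# reinforcement-learning model: pick a parallelism level, observe a reward
# (here, negative end-to-end latency), and update running value estimates.
class ParallelismController:
    def __init__(self, levels=(1, 2, 4, 8), epsilon=0.1):
        self.levels = levels                      # candidate degrees (assumed)
        self.epsilon = epsilon                    # exploration probability
        self.counts = {p: 0 for p in levels}
        self.values = {p: 0.0 for p in levels}

    def choose(self):
        # Explore occasionally so the controller can adapt to workload shifts.
        if random.random() < self.epsilon:
            return random.choice(self.levels)
        return max(self.levels, key=lambda p: self.values[p])

    def update(self, level, reward):
        # Incremental mean of the rewards observed for this parallelism level.
        self.counts[level] += 1
        self.values[level] += (reward - self.values[level]) / self.counts[level]


if __name__ == "__main__":
    ctrl = ParallelismController()
    for _ in range(200):
        p = ctrl.choose()
        # Hypothetical workload: latency improves with parallelism up to a
        # point, then coordination overhead dominates; noise models variability.
        latency_ms = 100.0 / p + 5.0 * p + random.gauss(0, 2)
        ctrl.update(p, -latency_ms)
    print("preferred parallelism:", max(ctrl.values, key=ctrl.values.get))
```

In practice a full reinforcement-learning formulation would also condition on workload state (input rate, queue lengths), which this stateless sketch omits.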
In the second dimension of this dissertation, we propose a novel model for allocating computing resources in edge and cloud computing environments.
In edge computing environments, the model allows service providers to establish resource sharing contracts with infrastructure providers a priori in a latency-aware manner.
In geo-distributed cloud environments, it allows cloud service providers to establish resource sharing contracts with individual datacenters a priori, for defined time intervals, in a cost-aware manner.
Based on these mechanisms, we develop a decentralized implementation of the contract-based resource allocation model for geo-distributed resources using Smart Contracts in Ethereum.
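The sketch below illustrates, under stated assumptions, the kind of a priori contract establishment the abstract describes: a service provider states a demand and a latency bound, infrastructure providers publish offers, and a greedy cost-aware matcher selects contracts. The offer fields, the greedy policy, and the provider names are illustrative; the dissertation's decentralized version records such contracts via Ethereum Smart Contracts, which this off-chain Python sketch does not model.

```python
from dataclasses import dataclass

# A minimal sketch, not the dissertation's protocol: match a capacity demand
# against published offers, keeping only offers within the latency bound and
# preferring the cheapest ones. All field names here are assumptions.
@dataclass
class Offer:
    provider: str
    capacity: int        # compute units offered for the interval
    latency_ms: float    # expected latency from the requesting region
    unit_price: float    # cost per capacity unit for the interval


def establish_contracts(demand, max_latency_ms, offers):
    """Return (provider, units) contracts covering the demand, or None."""
    feasible = [o for o in offers if o.latency_ms <= max_latency_ms]
    feasible.sort(key=lambda o: o.unit_price)     # cost-aware ordering
    contracts, remaining = [], demand
    for offer in feasible:
        if remaining == 0:
            break
        units = min(offer.capacity, remaining)
        contracts.append((offer.provider, units))
        remaining -= units
    return contracts if remaining == 0 else None   # None: demand not coverable


if __name__ == "__main__":
    offers = [
        Offer("edge-site-a", capacity=4, latency_ms=8.0, unit_price=1.2),
        Offer("edge-site-b", capacity=6, latency_ms=15.0, unit_price=0.9),
        Offer("cloud-dc-1", capacity=50, latency_ms=60.0, unit_price=0.5),
    ]
    print(establish_contracts(demand=8, max_latency_ms=20.0, offers=offers))
```

A decentralized deployment would replace the in-memory matcher with contract state and transactions on a blockchain, so that no single party controls the allocation record.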
Details
Item Type: University of Pittsburgh ETD
Status: Unpublished
Creators/Authors: Xu, Jinlai
ETD Committee:
Date: 17 January 2022
Date Type: Publication
Defense Date: 23 November 2021
Approval Date: 17 January 2022
Submission Date: 8 December 2021
Access Restriction: No restriction; release the ETD for access worldwide immediately.
Number of Pages: 167
Institution: University of Pittsburgh
Schools and Programs: School of Computing and Information > Information Science
Degree: PhD - Doctor of Philosophy
Thesis Type: Doctoral Dissertation
Refereed: Yes
Uncontrolled Keywords: resource management, stream processing, edge computing, resource sharing, cloud computing, reinforcement learning, blockchain
Related URLs:
Date Deposited: 17 Jan 2022 15:04
Last Modified: 17 Jan 2022 15:04
URI: http://d-scholarship.pitt.edu/id/eprint/42094