Maps SwiftUI
DUE Wed, 11/20, 2 pm
The goals of this lab are threefold: first, to introduce you to Apple's CoreLocation API, which we will use together with the MapKit SDK. Second, to introduce you to asynchronous event streams in the form of Swift's AsyncSequence and the AsyncStream generator. And third, to integrate Apple's MapKit with the Chatter app.
In the map-augmented Chatter app, we will add a Map View. On the map, there will be one or more markers. Each marker represents a posted chatt. If you click on a marker, it displays the poster's username, message, timestamp, and their geodata, consisting of their geolocation and velocity (compass-point facing and movement speed), captured at the time the chatt was posted. If a chatt was posted with the user's geodata, the timeline now shows the chatt with a location button. Clicking this button brings the user to the MapView with the chatt's posted location marked on the map.
We will also implement a swiping gesture to allow users to switch from the main timeline view to the map view. When a user swipes left to transition from the timeline view to the map view, the current trove of retrieved chatts will each be displayed as an individual marker. From the map view, users cannot post a chatt; they can only return to the timeline view. Once a user posts a chatt, they also can only return to the timeline view, not the map view. Users also cannot initiate a new retrieval of chatts in the map view.
Reading the magnetometer part of this lab requires access to a physical device. While the iPhone simulator can simulate location data, it cannot simulate the magnetometer used to determine facing.
Expected behavior
Post a new chatt and view chatts on MapView:
DISCLAIMER: the video demo shows you one aspect of the app’s behavior. It is not a substitute for the spec. If there are any discrepancies between the demo and the spec, please follow the spec. The spec is the single source of truth. If the spec is ambiguous, please consult the teaching staff for clarification.
Preparing your GitHub repo
- On your laptop, navigate to YOUR_LABSFOLDER/
- Unzip your chatter.zip that you created as part of the audio lab
- Rename the newly unzipped chatter folder maps
- If there's a DerivedData folder in your maps/swiftUIChatter/, delete it
- Push your local YOUR_LABSFOLDER/ repo to GitHub and make sure there are no git issues

git push

- Open GitHub Desktop and click on Current Repository on the top left of the interface
- Click on your 441 GitHub repo
- Add Summary to your changes and click Commit to master (or Commit to main)
- Since you have pushed your back end code, you'll have to click Pull Origin to sync up the repo on your laptop
- Finally click on Push Origin to push changes to GitHub
Go to the GitHub website to confirm that your folders follow this structure outline:
441
|-- # files and folders from other labs . . .
|-- maps
    |-- swiftUIChatter
        |-- swiftUIChatter.xcodeproj
        |-- swiftUIChatter
|-- # files and folders from other labs . . .
If the folders in your GitHub repo do not have the above structure, we will not be able to grade your labs and you will get a ZERO.
Collecting sensor updates as asynchronous stream
We first go through the details of how to get the user's geolocation information: latitude (lat), longitude (lon), and velocity data (facing and speed).
We want to receive continuous updates of the phone's location and heading. Apple's CoreLocation API has recently been updated to deliver location updates as an AsyncSequence. The heading updates API, however, still relies on a delegated callback. We will use AsyncStream to convert the delegated callback into an AsyncSequence of heading updates. Conceptually, understanding AsyncSequence, its use, and its generation using AsyncStream is the most technical aspect of this lab.
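To preview the pattern before applying it to CoreLocation, here is a minimal, self-contained sketch (not part of the lab code) of turning a callback-based API into an AsyncSequence with AsyncStream. The names onTick and startTicking are made up for illustration:

```swift
import Dispatch
import Foundation

// Hypothetical callback-based API, standing in for a delegated callback:
var onTick: ((Int) -> Void)?

func startTicking() {
    // simulate a driver firing the callback three times
    for i in 1...3 { onTick?(i) }
}

// Bridge the callback into an AsyncSequence with AsyncStream:
let ticks = AsyncStream<Int> { cont in
    onTick = { cont.yield($0) }  // the callback now feeds the stream
    startTicking()
    cont.finish()                // finite stream: signal end of elements
}

var collected: [Int] = []
let done = DispatchSemaphore(value: 0)
Task {
    for await t in ticks { collected.append(t) }
    done.signal()
}
done.wait()
print(collected)  // [1, 2, 3]
```

The semaphore is only there so this script-style example waits for the Task to drain the stream; in the app, the for-await loop simply runs inside a Task for the lifetime of the app.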
Requesting permission
To get the user's location, we must first request the user's permission and explain why we need the location data. Similar to how we had to provide justification for requesting mic access in an earlier lab, go ahead and enter the justification for requesting Privacy - Location When In Use Usage Description in your Info.plist file. Unlike with the mic, iOS doesn't automatically prompt for location permission. So we will have to do it manually later.
If you accidentally choose “Allow Once” or “Don’t Allow” when your app requests permission, on iOS 14 or later, go to
Settings > Privacy > Location Services
, select your app and tap “Ask Next Time” to reset it.
Location Manager
Create a Swift file and name it LocManager
. At the top of the file, import MapKit
and Observation
. Next create an observable LocManager
singleton:
@Observable
final class LocManager: NSObject, CLLocationManagerDelegate {
    static let shared = LocManager()
    private let locManager = CLLocationManager()

    override private init() {
        super.init()
        // configure the location manager
        locManager.desiredAccuracy = kCLLocationAccuracyBest
        locManager.delegate = self
    }
}
Inside the class we create an instance of CLLocationManager, specify our desired location accuracy, and assign self as the delegate to handle CLLocationManagerDelegate callbacks. To be the delegate, the class must adopt CLLocationManagerDelegate, which requires the class to be a subtype of NSObject; its designated initializer must therefore override that of NSObject and must call super.init() to initialize NSObject.
To retrieve the location updates provided by CoreLocation, add the following code to your LocManager class:
private(set) var location = CLLocation()

func startUpdates() {
    if locManager.authorizationStatus == .notDetermined {
        // ask for user permission if undetermined
        // Be sure to add 'Privacy - Location When In Use Usage Description' to
        // Info.plist, otherwise location read will fail silently,
        // with (lat/lon = 0)
        locManager.requestWhenInUseAuthorization()
    }

    Task {
        do {
            for try await update in CLLocationUpdate.liveUpdates() {
                location = update.location ?? location
            }
        } catch {
            print(error.localizedDescription)
        }
    }
}
As mentioned, iOS doesn't automatically prompt the user for location permission. Instead, we first call CLLocationManager.requestWhenInUseAuthorization() to present the prompt. You can use either requestWhenInUseAuthorization() or requestAlwaysAuthorization(), corresponding to the permission key-value pair you listed in your Info.plist file.
After obtaining the user's permission, we rely on the observable property location, initialized with an instance of CLLocation. We use for try await to collect updates from the AsyncSequence returned by CLLocationUpdate.liveUpdates() and store the latest update in the location observable.
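The for try await / catch shape used here can be tried out independently of CoreLocation. Below is a small sketch using an AsyncThrowingStream of made-up latitude values in place of CLLocationUpdate.liveUpdates():

```swift
import Dispatch
import Foundation

// Made-up latitude values stand in for CLLocationUpdate.liveUpdates():
let updates = AsyncThrowingStream<Double, Error> { cont in
    cont.yield(42.28)
    cont.yield(42.29)
    cont.finish()   // or cont.finish(throwing: someError) on failure
}

var latest = 0.0
let done = DispatchSemaphore(value: 0)
Task {
    do {
        // same shape as the loop over liveUpdates() above
        for try await lat in updates { latest = lat }
    } catch {
        print(error.localizedDescription)
    }
    done.signal()
}
done.wait()
print(latest)  // 42.29
```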
Ideally, we would start and stop location updates only as needed. In this lab, however, we need location information both in MainView, when swiping left to MapView, and in PostView, when posting a chatt. We thus start collecting location updates upon initialization of the app in swiftUIChatterApp and never stop collecting updates for the lifetime of the app. Add the following to the initializer of your swiftUIChatterApp (in its eponymous file) after the call to getChatts():
LocManager.shared.startUpdates()
With each location update, we also get a speed update from the device. Back in LocManager.swift, add a computed property to the LocManager class to read the latest speed:
var speed: String {
    switch location.speed {
    case 0.5..<5: "walking"
    case 5..<7: "running"
    case 7..<13: "cycling"
    case 13..<90: "driving"
    case 90..<139: "in train"
    case 139..<225: "flying"
    default: "resting"
    }
}
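CLLocation.speed is reported in meters per second, with a negative value when the reading is invalid. The bucketing can be checked in isolation with a standalone version of the same switch (a sketch, independent of LocManager):

```swift
// Standalone sketch of the speed bucketing; input is meters/second.
// A negative (invalid) speed falls through to "resting".
func speedLabel(_ metersPerSecond: Double) -> String {
    switch metersPerSecond {
    case 0.5..<5: "walking"
    case 5..<7: "running"
    case 7..<13: "cycling"
    case 13..<90: "driving"
    case 90..<139: "in train"
    case 139..<225: "flying"
    default: "resting"
    }
}

print(speedLabel(1.4))   // walking
print(speedLabel(-1.0))  // resting
```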
Next we add a property to LocManager to store heading updates and a computed property that returns the heading in a human-friendly compass direction:
private var heading: CLHeading? = nil
private let compass = ["North", "NE", "East", "SE", "South", "SW", "West", "NW", "North"]

var compassHeading: String {
    return if let heading {
        compass[Int(round(heading.magneticHeading.truncatingRemainder(dividingBy: 360) / 45))]
    } else {
        "unknown"
    }
}
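The compass arithmetic can be sanity-checked on its own: the 0–360° heading is divided into 45° sectors and rounded to an index into the nine-entry table (both 0° and 360° map to "North", hence "North" appearing at both ends). A standalone sketch:

```swift
// Standalone sketch of the bucketing used by compassHeading.
func compassLabel(_ degrees: Double) -> String {
    let compass = ["North", "NE", "East", "SE", "South", "SW", "West", "NW", "North"]
    return compass[Int((degrees.truncatingRemainder(dividingBy: 360) / 45).rounded())]
}

print(compassLabel(0))    // North
print(compassLabel(95))   // East
print(compassLabel(350))  // North
```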
Observing computed properties
According to Apple's documentation, "Observation also supports tracking of computed properties when the computed property makes use of an observable property." In our case, location has a public getter and changes to location change speed, so speed is tracked; whereas both heading and compass are private, so compassHeading should not be tracked.
Unlike location, CoreLocation does not provide an API to read heading updates as an AsyncSequence; instead it still relies on a delegated callback to return heading updates. We now use the delegated callback mechanism to yield an event stream using AsyncStream. Add the following to your LocManager class:
@ObservationIgnored var headingStream: ((CLHeading) -> Void)?

func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
    headingStream?(newHeading)
}

var headings: AsyncStream<CLHeading> {
    AsyncStream(bufferingPolicy: .bufferingNewest(1)) { cont in
        headingStream = { cont.yield($0) }
        cont.onTermination = { @Sendable _ in
            self.locManager.stopUpdatingHeading()
        }
        locManager.startUpdatingHeading()
    }
}
First we create a property headingStream to hold a closure. We initialize this property when we create an instance of AsyncStream returned by the getter of the headings computed property. headingStream yields the provided argument (a new heading) to the continuation provided to the initialization block of AsyncStream. We call the headingStream closure in the delegated callback locationManager(_:didUpdateHeading:) with the newly updated heading as its argument. We create headingStream in the property getter of headings to avoid cross-property initialization.
In creating the AsyncStream, we set it to buffer only the latest update if updates arrive faster than we can read them (the default is to buffer an unbounded number of unread updates). The trailing closure in the AsyncStream instantiation is called (once) by its initializer. This initialization closure is provided with a continuation that allows AsyncStream to yield() elements to the awaiting construct (represented by the continuation). The elements emitted by AsyncStream comprise the AsyncSequence that we can loop through with for-await. In this case, the provided continuation represents the awaiting for-await loop.
In the initialization closure, we also call CLLocationManager.startUpdatingHeading() to start receiving heading updates from the device driver. We stop the device driver from updating the heading when the stream is terminated, using the continuation's onTermination callback. In this case, we have an endless supply of heading updates until we call stopUpdatingHeading(). In other cases, with a finite number of events, we can communicate the end of the stream to the continuation by calling cont.finish().
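The effect of .bufferingNewest(1) can be seen in a small self-contained example: values yielded faster than they are consumed are collapsed to the single latest one (with the default unbounded policy, all three would be delivered):

```swift
import Dispatch
import Foundation

// Three values are yielded before anyone reads the stream; the
// one-element buffer keeps only the newest.
let stream = AsyncStream<Int>(bufferingPolicy: .bufferingNewest(1)) { cont in
    cont.yield(1)
    cont.yield(2)
    cont.yield(3)   // only this survives in the 1-element buffer
    cont.finish()
}

var seen: [Int] = []
let done = DispatchSemaphore(value: 0)
Task {
    for await v in stream { seen.append(v) }
    done.signal()
}
done.wait()
print(seen)  // [3]
```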
Add another Task to your startUpdates() method, right after the existing Task block, to store the latest heading update in the heading property:
Task {
    for await newHeading in headings {
        heading = newHeading
    }
}
Apple's Map has its own mechanism to start and stop location and heading updates when the MapUserLocationButton is toggled and does not rely on our AsyncSequence.
Testing asynchronous streams collection
To test your location and heading update streams, add the following to your PostView's body:
Text("\(LocManager.shared.location.coordinate.latitude), \(LocManager.shared.location.coordinate.longitude), \(LocManager.shared.speed), \(LocManager.shared.compassHeading)")
You should see all the fields updated automatically as you move around, facing different directions—if not immediately then after a few seconds.
Remember to remove or comment the line out after you’re done testing.
Posting and displaying geodata
We can now obtain the user's lat/lon and heading from iOS's CLLocationManager. To post this geodata information with each chatt and later to display it on a map, we first need to update our Chatter backend and API.

If you haven't modified your back end to handle geodata, please go ahead and do so now.

Once you've updated your back end, return here to continue work on your front end.
Chatt
We add a new stored property geodata to the Chatt struct to hold the geodata associated with each chatt. We also make the struct conform to the Hashable protocol so that we can use it as a tag to identify the selected map marker in MapView later.
struct Chatt: Identifiable, Hashable {
    var username: String?
    var message: String?
    var id: UUID?
    var timestamp: String?
    var altRow = true
    var geodata: GeoData?

    // optional: so that we don't need to compare every property
    static func == (lhs: Chatt, rhs: Chatt) -> Bool {
        lhs.timestamp == rhs.timestamp
    }
}
GeoData
Create a new GeoData struct to store the additional geodata. Let's put it in a new GeoData.swift file. We also import MapKit:
import MapKit

struct GeoData: Hashable {
    var lat: Double = 0.0
    var lon: Double = 0.0
    var place: String = ""
    var facing: String = "unknown"
    var speed: String = "unknown"
}
GeoData adopts the Hashable protocol because we use it in Chatt, which must be Hashable due to its use as a tag for map markers in MapView.
Reverse geocoding
In addition to the lat/lon information, we use Apple's CLGeocoder to perform reverse geocoding to obtain human-friendly place names from lat/lon information. Add the following setPlace() method to your GeoData struct:
mutating func setPlace() async {
    let geolocs = try? await CLGeocoder().reverseGeocodeLocation(CLLocation(latitude: lat, longitude: lon))
    if let geolocs {
        place = geolocs[0].locality ?? geolocs[0].administrativeArea ?? geolocs[0].country ?? "Place unknown"
    } else {
        place = "Place unknown"
    }
}
We will call setPlace() before posting a chatt to compute and include the place name in the posted chatt. We have made setPlace() an asynchronous function so that we can await the completion of the asynchronous version of reverseGeocodeLocation before proceeding further (alternatively, another version of reverseGeocodeLocation uses a completion handler that is executed asynchronously at some indeterminate time).
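For reference, bridging a completion-handler API into an awaitable function is done with a continuation, which is essentially what the async version of reverseGeocodeLocation does for us. A sketch with a made-up lookupPlace callback API standing in for the geocoder:

```swift
import Dispatch
import Foundation

// Hypothetical completion-handler API, standing in for the
// completion-handler variant of reverseGeocodeLocation:
func lookupPlace(completion: @escaping (String?) -> Void) {
    DispatchQueue.global().async { completion("Ann Arbor") }
}

// Wrap it so callers can simply `await` the result:
func lookupPlaceAsync() async -> String {
    await withCheckedContinuation { (cont: CheckedContinuation<String, Never>) in
        lookupPlace { place in
            cont.resume(returning: place ?? "Place unknown")
        }
    }
}

var result = ""
let done = DispatchSemaphore(value: 0)
Task {
    result = await lookupPlaceAsync()
    done.signal()
}
done.wait()
print(result)  // Ann Arbor
```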
To present the geodata in a nicely formatted string, add the following computed property to your GeoData struct. We will use this property to display the geodata information to the user.
var postedFrom: AttributedString {
    var posted = try! AttributedString(markdown: "Posted from **\(place)** while facing **\(facing)** moving at **\(speed)** speed.")
    ["\(place)", "\(facing)", "\(speed)"].forEach {
        if !$0.isEmpty {
            posted[posted.range(of: $0)!].foregroundColor = .blue
        }
    }
    return posted
}
Post chatt with geodata

To post a chatt with the poster's geodata, first annotate your SubmitButton() in PostView with @MainActor because we'll be accessing location data in the MainActor:
@MainActor
@ViewBuilder
func SubmitButton() -> some View {
Then replace the whole action closure of Button in SubmitButton with the following:
var geodata = GeoData(
    lat: LocManager.shared.location.coordinate.latitude,
    lon: LocManager.shared.location.coordinate.longitude,
    facing: LocManager.shared.compassHeading,
    speed: LocManager.shared.speed)

Task {
    await geodata.setPlace()
    ChattStore.shared.postChatt(Chatt(username: username, message: message, geodata: geodata)) {
        ChattStore.shared.getChatts()
    }
    isPresented.toggle()
}
We launch the asynchronous function setPlace() in a Task and await its completion before posting the chatt with the geodata. We toggle isPresented after the call to postChatt(_:) returns (though the network posting may not have completed by then).
Next update postChatt(_:) in ChattStore.swift to pass along the geodata. Here's the updated top part of postChatt(_:):
func postChatt(_ chatt: Chatt, completion: @escaping () -> ()) {
    var geoObj: Data?
    if let geodata = chatt.geodata {
        geoObj = try? JSONSerialization.data(withJSONObject: [geodata.lat, geodata.lon, geodata.place, geodata.facing, geodata.speed])
    }
    let jsonObj = ["username": chatt.username,
                   "message": chatt.message,
                   "geodata": (geoObj == nil) ? nil : String(data: geoObj!, encoding: .utf8)]
    // ...
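To see the nesting this produces, the snippet below (with made-up sample values) builds the same wire format with Foundation's JSONSerialization: the geodata array is serialized to a JSON string that is then embedded as a plain string field of the outer JSON object:

```swift
import Foundation

// Sketch of the wire format produced by postChatt(_:), with made-up values.
// The inner geodata array becomes a JSON string nested inside the outer object.
let geoObj = try! JSONSerialization.data(
    withJSONObject: [42.2936, -83.7166, "Ann Arbor", "North", "walking"] as [Any])
let geoString = String(data: geoObj, encoding: .utf8)!

let payload = try! JSONSerialization.data(
    withJSONObject: ["username": "poster", "message": "hello", "geodata": geoString])

// Decoding reverses the nesting, as getChatts() will do on retrieval:
let outer = try! JSONSerialization.jsonObject(with: payload) as! [String: Any]
let inner = try! JSONSerialization.jsonObject(
    with: (outer["geodata"] as! String).data(using: .utf8)!) as! [Any]
print(inner[2] as! String)  // Ann Arbor
```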
Staying in the postChatt(_:) method, find the declaration of apiUrl and replace postchatt with postmaps in the url construction.
We are now ready to retrieve chatts from the back end.
getChatts(_:)
Again, find the declaration of apiUrl and replace getchatts with getmaps in the url construction.
To construct Chatt objects from retrieved JSON data, find the if chattEntry.count == self.nFields { block in getChatts() and replace the content of the if block with:
let geoArr = chattEntry[4]?.data(using: .utf8).flatMap {
    try? JSONSerialization.jsonObject(with: $0) as? [Any]
}

self.chatts.append(Chatt(username: chattEntry[0],
                         message: chattEntry[1],
                         id: UUID(uuidString: chattEntry[2]!),
                         timestamp: chattEntry[3],
                         altRow: idx % 2 == 0,
                         geodata: geoArr.map {
                             GeoData(lat: $0[0] as! Double,
                                     lon: $0[1] as! Double,
                                     place: $0[2] as! String,
                                     facing: $0[3] as! String,
                                     speed: $0[4] as! String)
                         }))
idx += 1
Each string array returned by the back end represents a single chatt. The fifth entry in each string array, chattEntry[4], contains the string holding an "inner" array of geodata. If this string is not nil, we first convert it to a JSON object, which we cast to an [Any] array. Assuming the cast doesn't leave us with nil, we construct a GeoData instance from the elements of the array. We can then use this GeoData instance, along with the other elements of the "outer" array, to construct a Chatt.
Displaying geodata on timeline
Before we look at how to display geodata information on a map, let's display it on the chatt timeline, to confirm that we are posting and retrieving the correct information.
Update our ChattListRow by wrapping a VStack around:
if let message = chatt.message {
    Text(message).padding(EdgeInsets(top: 8, leading: 0, bottom: 6, trailing: 0))
}
and adding the following item inside the VStack under the message:
if let geodata = chatt.geodata {
    Text(geodata.postedFrom).padding(EdgeInsets(top: 8, leading: 0, bottom: 6, trailing: 0)).font(.system(size: 14))
}
With these changes, you should now be able to post a chatt with geodata information and display the geodata alongside retrieved chatts. Try it out.
MapView
We support two ways to view geodata on a map:
- viewing the posting location of a single chatt, and
- viewing the posting locations of all retrieved chatts with a swipe-left gesture.
Create a new Swift file and name it MapView. Put the following imports and MapView struct in the file:
import SwiftUI
import MapKit
struct MapView: View {
    @Binding var cameraPosition: MapCameraPosition
    let chatt: Chatt?
    @State private var selected: Chatt?

    var body: some View {
        Map(position: $cameraPosition, selection: $selected) {
            if let chatt {
                if let geodata = chatt.geodata {
                    Marker(chatt.username!, systemImage: "figure.wave",
                           coordinate: CLLocationCoordinate2D(latitude: geodata.lat, longitude: geodata.lon))
                    .tint(.red)
                    .tag(chatt)
                }
            } else {
                ForEach(ChattStore.shared.chatts, id: \.self) { chatt in
                    if let geodata = chatt.geodata {
                        Marker(chatt.username!, systemImage: "figure.wave",
                               coordinate: CLLocationCoordinate2D(latitude: geodata.lat, longitude: geodata.lon))
                        .tint(.mint)
                    }
                }
            }
            if let chatt = selected, let geodata = chatt.geodata {
                Annotation(chatt.username!,
                           coordinate: CLLocationCoordinate2D(latitude: geodata.lat, longitude: geodata.lon),
                           anchor: .topLeading) {
                    InfoView(chatt: chatt)
                }
                .annotationTitles(.hidden)
            }
            UserAnnotation() // shows user location
        }
    }
}
We use the chatt property to determine whether to display the location of a single chatt or the locations of all chatts in the chatts array in ChattStore, depending on whether the chatt property is nil.
In rendering the Map, we pass in a cameraPosition that specifies the coordinates (lat/lon) the camera is pointing at, the zoom level (distance and height), and pitch (angle) of the camera. Additionally, we also show the user's current location by calling UserAnnotation().
The following modifiers to the call to Map control whether:
- to add a button to center on the user's location,
- to display a compass on the map, and
- to add a scale control

.mapControls {
    MapUserLocationButton()
    MapCompass()
    MapScaleView()
}
On the map, each chatt is represented by a marker, displayed at the coordinates (lat/lon) the chatt was posted from. When the user taps on a chatt's marker, it is considered "selected" and both MapView and Map will be recomputed and re-rendered. If we're displaying the location of a single chatt, that single chatt is treated as the selected Marker. Being able to use a chatt as the selected Marker is why we made the Chatt struct adopt the Hashable protocol earlier. When a marker is selected, an annotation is displayed at the coordinates of the marker. We'll use the following information window to display the annotation:
struct InfoView: View {
    let chatt: Chatt

    var body: some View {
        VStack(alignment: .leading) {
            HStack {
                if let username = chatt.username, let timestamp = chatt.timestamp {
                    Text(username).padding(EdgeInsets(top: 4, leading: 8, bottom: 0, trailing: 0)).font(.system(size: 16))
                    Spacer()
                    Text(timestamp).padding(EdgeInsets(top: 4, leading: 8, bottom: 0, trailing: 4)).font(.system(size: 12))
                }
            }
            if let message = chatt.message {
                Text(message).padding(EdgeInsets(top: 1, leading: 8, bottom: 0, trailing: 4)).font(.system(size: 14)).lineLimit(2, reservesSpace: true)
            }
            if let geodata = chatt.geodata {
                Text("\(geodata.postedFrom)").padding(EdgeInsets(top: 0, leading: 8, bottom: 10, trailing: 4)).font(.system(size: 12)).lineLimit(2, reservesSpace: true)
            }
        }
        .background {
            Rectangle()
                .fill(.ultraThinMaterial)
                .cornerRadius(4.0)
        }
        .frame(width: 300)
    }
}
Swipe left to view the geodata of all chatts
To recognize a swipe left gesture in MainView, first add:
import MapKit
to your MainView.swift file. Next, add the following properties to your MainView struct:
@State private var cameraPosition: MapCameraPosition = .userLocation(fallback: .automatic)
@State private var selected: Chatt?
@State private var isMapping = false
Then add the following modifiers to the List call in MainView:
.gesture(DragGesture(minimumDistance: 3.0, coordinateSpace: .local)
    .onEnded { value in
        if case (...0, -100...100) = (value.translation.width, value.translation.height) {
            cameraPosition = .camera(MapCamera(
                centerCoordinate: CLLocationCoordinate2D(
                    latitude: LocManager.shared.location.coordinate.latitude,
                    longitude: LocManager.shared.location.coordinate.longitude),
                distance: 500, heading: 0, pitch: 60))
            isMapping.toggle()
        }
    }
)
.navigationDestination(isPresented: $isMapping) {
    MapView(cameraPosition: $cameraPosition, chatt: selected)
}
.onAppear {
    selected = nil
}
Prior to navigating to MapView (by toggling isMapping), we set the cameraPosition to the user's current location and set the distance, heading, and pitch of the camera. When declaring the cameraPosition property earlier, we initialized it to .userLocation() because a @State property must have an initial value. However, .userLocation() doesn't allow control of these three properties of the camera, hence we manually specify the camera position again here.
We use .navigationDestination(isPresented:) to navigate instead of .fullScreenCover(isPresented:) to retain the back button when displaying the destination View. I couldn't find any documentation to confirm the following, but from code tracing it looks like the back button works by implicitly toggling the isPresented flag, as one would expect. Hence we don't need to manually toggle isMapping in MapView.
Recall that when MapView is passed nil for its chatt parameter, it will display all chatts in the chatts array, instead of displaying just an individual chatt. We always set selected to nil when MainView (re-)appears.
Viewing the geodata of a single chatt
To enable the user to view the poster location of a single chatt, we add a "location" button to each chatt in the MainView timeline, similar to how we added an "audio" button in the audio lab.

First add the following function to your MainView:
func displayChatt(chatt: Chatt) {
    selected = chatt
    isMapping = true
}
and pass it to your instantiation of ChattListRow in MainView, along with a reference to the cameraPosition:
ChattListRow(chatt: $0, displayChatt: displayChatt, cameraPosition: $cameraPosition)
TODO 1/1:
Update your definition of ChattListRow so that it can be instantiated with the displayChatt() function and the reference to cameraPosition as we did above.
Similar to how we added an "audio" button in the ChattListRow of the audio lab, add a "location" button next to the display of message and geodata information in ChattListRow. But do this if and only if a chatt has geodata information. Create an image with systemName of "mappin.and.ellipse" as the label of the Button.
When the user taps the button, set the cameraPosition to the latitude and longitude recorded in the chatt's geodata property. Use the .camera() method to do this as we did in the previous section, when handling the user swiping left. Set the distance, heading, and pitch of the camera as we did above. This will center the camera on, and zoom into, the poster's location.
Then call the displayChatt() function MainView passed in to ChattListRow with the chatt of the row corresponding to where the user tapped the "location" button.

Don't forget to add import MapKit to the top of your ChattListRow.swift file.
Keeping a State in the parent's View and passing an update function to child Views, without providing the State as an environment object, is what is more commonly considered "state hoisting". Passing an update function (or rather, a pointer to it) to child Views has the added benefit that the pointer doesn't change when the State changes, thereby saving child Views from re-rendering.
To recap, by the end of this lab, if you tap on the location button associated with each chatt, you will see a map centered and zoomed in on the poster's location, with a marker at the location. Swiping left on MainView will bring you to the map view with all retrieved chatts shown as markers on the map and the map centered and zoomed in on the user's current location. In both cases, tapping the location arrow button should pan and zoom onto the user's current location.
Simulating locations
While running your project in Xcode, you can simulate location by clicking on the location arrow button in the debug console menu (screenshot) or access the feature from the main menu Debug > Simulate Location. When you post a chatt, or when you view all chatts, the user's current location should be the simulated location. You can select a different location in Xcode and it should again be reflected in Chatter.
Submission guidelines
We will only grade files committed to the master or main branch. If you use multiple branches, please merge them all to the master/main branch for submission.
Ensure that you have completed the back-end part and have pushed your changes to your back-end code to your 441 GitHub repo.

Push your maps lab folder to your GitHub repo as set up at the start of this spec.
git push
- Open GitHub Desktop and click on Current Repository on the top left of the interface
- Click on your 441 GitHub repo
- Add Summary to your changes and click Commit to master (or Commit to main)
- If you have a team mate and they have pushed changes to GitHub, you'll have to click Pull Origin and resolve any conflicts
- Finally click Push Origin to push changes to GitHub
Go to the GitHub website to confirm that your front-end files have been uploaded to your GitHub repo under the folder maps
. Confirm that your repo has a folder structure outline similar to the following. If your folder structure is not as outlined, our script will not pick up your submission, you will get ZERO point, and you will further have problems getting started on latter labs. There could be other files or folders in your local folder not listed below, don’t delete them. As long as you have installed the course .gitignore
as per the instructions in Preparing GitHub for EECS 441 Labs, only files needed for grading will be pushed to GitHub.
441
|-- # files and folders from other labs . . .
|-- maps
    |-- swiftUIChatter
        |-- swiftUIChatter.xcodeproj
        |-- swiftUIChatter
|-- # files and folders from other labs . . .
Verify that your Git repo is set up correctly: on your laptop, grab a new clone of your repo and build and run your submission to make sure that it works. You will get a ZERO if your lab doesn't open, build, or run.
IMPORTANT: If you work in a team, put your team mate's name and uniqname in your repo's README.md (click the pencil icon at the upper right corner of the README.md box on your git repo) so that we'd know. Otherwise, we could mistakenly think that you were cheating and accidentally report you to the Honor Council, which would be a hassle to undo. You don't need a README.md if you work by yourself.
Review your information on the Lab Links sheet. If you've changed your teaming arrangement from the previous lab's, please update your entry. If you're using a different GitHub repo from the previous lab's, invite eecs441staff@umich.edu to your new GitHub repo and update your entry.
References
- Discover streamlined location updates
- AsyncStream
- Using AsyncSequence in Swift
- Meet MapKit for SwiftUI
- CLLocationCoordinate2D
- CLGeocoder
- CLPlacemark
- Creating Attributed Strings with Markdown
- AttributedStrings – Making Text More Beautiful Than Ever
- DragGesture
Prepared for EECS 441 by Wendan Jiang, Alexander Wu, Benjamin Brengman, Ollie Elmgren, Nowrin Mohamed, Yibo Pi, and Sugih Jamin | Last updated: September 14th, 2024 |