BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Berkeley Graduate Division - ECPv6.15.18//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:Berkeley Graduate Division
X-ORIGINAL-URL:https://grad.berkeley.edu
X-WR-CALDESC:Events for Berkeley Graduate Division
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/Los_Angeles
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20230312T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20231105T020000
RRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20240411T100000
DTEND;TZID=America/Los_Angeles:20240411T130000
DTSTAMP:20260406T210836Z
CREATED:20240403T202101Z
LAST-MODIFIED:20240403T202101Z
UID:44120-1712829600-1712840400@grad.berkeley.edu
SUMMARY:Python Text Analysis: Word Embeddings
DESCRIPTION:How can we use neural networks to create meaningful representations of words? The bag-of-words model is limited in its ability to characterize text because it does not capture word context. In this part\, we study word embeddings\, which were among the first attempts to use neural networks to develop numerical representations of text that incorporate context. We learn how to use the gensim package to construct and explore word embeddings of text.
URL:https://grad.berkeley.edu/event/python-text-analysis-word-embeddings/
LOCATION:Online via Zoom
CATEGORIES:Professional Development Events
END:VEVENT
END:VCALENDAR