I am an associate professor of Computer Engineering at Koç University in Istanbul, working in the Artificial Intelligence Laboratory. Previously I was at the MIT AI Lab and later co-founded Inquira, Inc. My research is in natural language processing and machine learning. For prospective students, here are some research topics, papers, classes, blog posts, and past students.
I am a faculty member in the Computer Engineering Department at Koç University and work in the Artificial Intelligence Laboratory. Before that I worked at the MIT AI Lab and founded Inquira, Inc. My research topics are natural language processing and machine learning. For interested students: research topics, papers, my courses, my Turkish posts, and our alumni.

August 25, 2017

Relational Symbol Grounding through Affordance Learning: An Overview of the ReGround Project

Laura Antanas et al. In Grounding Language Understanding (GLU 2017), ISCA Satellite Workshop of Interspeech 2017. (PDF, PPT)

Abstract: Symbol grounding is the problem of associating symbols from language with a corresponding referent in the environment. Traditionally, research has focused on identifying single objects and their properties. The ReGround project hypothesizes that the grounding process must consider the full context of the environment, including multiple objects, their properties, and relationships among these objects. ReGround targets the development of a novel framework for “affordance grounding”, by which an agent placed in a new environment can adapt to its new setting and interpret possibly multi-modal input in order to correctly carry out the requested tasks.


Full post...

August 03, 2017

Facebook's AI program is not planning to take over the world

The sensational news stories that have recently circulated about a Facebook AI experiment have little to do with reality:
Full post...

July 26, 2017

Parsing with context embeddings

Ömer Kırnap, Berkay Furkan Önder and Deniz Yuret. In Proceedings of the CoNLL 2017 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies, Vancouver, 2017. (PDF, poster, presentation, related posts).

Abstract. We introduce context embeddings, dense vectors derived from a language model that represent the left/right context of a word instance, and demonstrate that context embeddings significantly improve the accuracy of our transition-based parser. Our model consists of a bidirectional LSTM (BiLSTM) based language model that is pre-trained to predict words in plain text, and a multi-layer perceptron (MLP) decision model that uses features from the language model to predict the correct actions for an ArcHybrid transition-based parser. We participated in the CoNLL 2017 UD Shared Task as the "Koç University" team and our system was ranked 7th out of 33 systems that parsed 81 treebanks in 49 languages.


Full post...

May 23, 2017

JuliaCon 2017, Berkeley, June 20-24

I gave a talk at JuliaCon introducing Knet on Wednesday, June 21, 2017, 4:16pm - 4:52pm, East Pauley Ballroom, Berkeley, CA. See these related posts.
Full post...

May 17, 2017

Congratulations to the Koç parsing team

Our neural net based dependency parser was number 7 overall out of 33 teams participating in the CoNLL 2017 Shared Task "Multilingual Parsing from Raw Text to Universal Dependencies" in which participating teams had to parse 68 corpora in 50 languages. I would like to thank Ömer Kırnap and Berkay Furkan Önder for their contributions and all-nighters.
Full post...