Offloading resource-hungry tasks from mobile devices to an edge server has recently been explored as a way to reduce task completion time and save battery energy. The low-latency computing resources offered by edge servers are a natural fit for such task offloading. However, edge servers may deliver unreliable performance because of rapid workload variation and reliance on intermittent renewable energy. Further, the limited battery of a mobile device makes optimal online offloading decisions challenging, since the battery couples offloading decisions across different tasks. In this paper, we propose DeepTO, a deep Q-learning based solution for online task offloading. DeepTO learns edge server performance in a model-free manner and accounts for the future battery needs of the mobile device. Using a simulation-based evaluation, we show that DeepTO performs close to an optimal solution that has complete knowledge of the future.
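As a rough illustration of the kind of agent the abstract describes (not the paper's actual architecture), the sketch below shows a small deep Q-network whose state is assumed to include the task size, an observed edge-server load estimate, and the device's remaining battery, and whose two actions are to execute locally or to offload. The state features, network sizes, hyperparameters, and reward shaping are all illustrative assumptions, not DeepTO's design.

```python
# Illustrative sketch only: a minimal deep Q-network for binary offloading
# decisions (local vs. edge), assuming a 3-feature state of
# (task size, observed edge load, remaining battery). All dimensions,
# hyperparameters, and the reward definition are hypothetical, not DeepTO's.
import random
import torch
import torch.nn as nn

q_net = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 2))
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)
GAMMA, EPSILON = 0.95, 0.1  # discount factor and exploration rate (assumed)

def choose_action(state):
    """Epsilon-greedy choice: 0 = run locally, 1 = offload to edge."""
    if random.random() < EPSILON:
        return random.randint(0, 1)
    with torch.no_grad():
        return int(torch.argmax(q_net(torch.tensor(state))).item())

def td_update(state, action, reward, next_state):
    """One model-free temporal-difference step on an observed transition."""
    q_sa = q_net(torch.tensor(state))[action]
    with torch.no_grad():
        target = reward + GAMMA * q_net(torch.tensor(next_state)).max()
    loss = nn.functional.mse_loss(q_sa, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Example transition: the reward here simply penalizes a placeholder cost of
# the chosen execution mode (a hypothetical shaping; the actual objective
# would combine latency and battery energy as the paper defines them).
s = [0.4, 0.7, 0.6]          # task size, edge load, battery level (normalized)
a = choose_action(s)
r = -(0.2 + 0.1 * a)         # placeholder cost of the chosen action
s_next = [0.3, 0.65, 0.55]
td_update(s, a, r, s_next)
```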