1: Set up native MySQL pagination
```python
# Pagination for the monitoring-system home page
def MyPagination(limitid, offsetid):
    limitid = str(limitid)
    offsetid = str(offsetid)
    # Native MySQL query: for each asin, join back the row holding the minimum ranking
    show_goods = ("select dal_keywordtable.* from "
                  "(select asin, min(ranking) as minRanking from `dal_keywordtable` GROUP BY asin) t2 "
                  "LEFT JOIN dal_keywordtable on t2.asin = dal_keywordtable.asin "
                  "and t2.minRanking = dal_keywordtable.ranking "
                  "limit " + limitid + " offset " + offsetid + ";")
    return show_goods
```

A quick test call:

```python
ret = MyPagination(1, 2)
print(ret)
```

The result, just to confirm the parameters are substituted into the query:

```
select dal_keywordtable.* from (select asin, min(ranking) as minRanking from `dal_keywordtable` GROUP BY asin) t2 LEFT JOIN dal_keywordtable on t2.asin = dal_keywordtable.asin and t2.minRanking = dal_keywordtable.ranking limit 1 offset 2;
```

`limit 1 offset 2` skips the first 2 rows and returns 1 row.
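Concatenating user-supplied values straight into the SQL string is vulnerable to SQL injection. A safer variant (a sketch; the function name `my_pagination_params` is mine, not from the original post) keeps the SQL static and returns the limit/offset as a separate parameter list, which Django's `raw(sql, params)` can bind safely:

```python
# Sketch: same query as MyPagination, but with %s placeholders so the
# database driver binds limit/offset instead of string concatenation.
def my_pagination_params(limitid, offsetid):
    """Return the pagination SQL and its parameter list."""
    sql = (
        "select dal_keywordtable.* from "
        "(select asin, min(ranking) as minRanking from `dal_keywordtable` "
        "GROUP BY asin) t2 "
        "LEFT JOIN dal_keywordtable on t2.asin = dal_keywordtable.asin "
        "and t2.minRanking = dal_keywordtable.ranking "
        "limit %s offset %s"
    )
    # int() also rejects non-numeric input early
    return sql, [int(limitid), int(offsetid)]

# Usage with the model from this post (not executed here):
# sql, params = my_pagination_params(limitid, offsetid)
# obj_rawqueryset = models.KeyWordTable.objects.raw(sql, params)
```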
2: Read the limit and offset passed on the GET request URL
```python
from utils.orm import MyPagination  # import the pagination helper defined above

# Call the helper with the request parameters:
# start at row offsetid and take limitid rows
limitid = request._request.GET.get("limit")
offsetid = request._request.GET.get("offset")
raw_Pagination = MyPagination(limitid, offsetid)
obj_rawqueryset = models.KeyWordTable.objects.raw(raw_Pagination)  # a RawQuerySet
```
3: Serialize the RawQuerySet result and return it as JSON
Serialization:

```python
json_data = {}
data_list = []
for obj in obj_rawqueryset:
    data = {}  # create the dict inside the loop so each row gets its own
    data["goods_title"] = obj.goods_title
    data["src"] = obj.src
    data["single_keyword"] = obj.single_keyword
    data["asin"] = obj.asin
    data["seed_asin"] = obj.seed_asin
    data["ranking"] = obj.ranking
    data["position"] = obj.position
    data["time"] = obj.time.strftime('%Y-%m-%d')
    data_list.append(data)
json_data["data"] = data_list
return Response(json_data)
```

The JSON returned to the frontend looks like:

```json
{
    "data": [
        {
            "goods_title": "Sloggers Women's Waterproof Rain and Garden Shoe with Comfort Insole, Blue Swirls, Size 6, Style 5117SWL06",
            "src": "https://images-na.ssl-images-amazon.com/images/I/41CSEL0bHeL._SS36_.jpg",
            "single_keyword": null,
            "asin": "B077TDGR51",
            "seed_asin": "B01H7X9ASA",
            "ranking": 666666,
            "time": "2019-12-02"
        },
        {
            "goods_title": "S2loggers Women's Waterproof Rain and Garden Shoe with Comfort Insole",
            "asin": "B077TDGR52",
            "seed_asin": "B01H7X9ASC",
            "time": "2019-12-03"
        }
    ]
}
```
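The eight per-field assignments above can be collapsed with a field list and `getattr`. A sketch of that idea, using a plain class (`FakeRow`, my own stand-in, not part of the original post) in place of a real RawQuerySet row so it runs without a database:

```python
from datetime import date

# Fields copied verbatim from the serialization loop above
FIELDS = ["goods_title", "src", "single_keyword", "asin",
          "seed_asin", "ranking", "position"]

def serialize_row(obj):
    """Build one row's dict via getattr instead of repeated assignments."""
    data = {name: getattr(obj, name) for name in FIELDS}
    data["time"] = obj.time.strftime("%Y-%m-%d")  # date needs formatting
    return data

class FakeRow:  # stand-in for one row of the RawQuerySet
    goods_title = "demo title"
    src = "https://example.com/img.jpg"
    single_keyword = None
    asin = "B077TDGR51"
    seed_asin = "B01H7X9ASA"
    ranking = 1
    position = 3
    time = date(2019, 12, 2)

print(serialize_row(FakeRow())["time"])  # → 2019-12-02
```

In the view, the loop then reduces to `data_list = [serialize_row(obj) for obj in obj_rawqueryset]`.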